Inception Hosting – Shared hosting starting around $1/year and more!

Anthony from over at Inception Hosting is back after quite a few years of silence. They have submitted three shared hosting plans, all hosted out of their London, UK location, that look like excellent deals! One key feature to point out: these plans include Cloudflare Railgun (a $200 value) at a VERY low price!

You can find their ToS/Legal Docs here. They accept PayPal, debit/credit cards via Stripe, and crypto via CoinGate as payment methods.

Here’s what they had to say: 

“Inception Hosting Limited was established in January 2011. The company was set up in response to what seemed to be a generally mediocre level of service on the budget side of the industry.

Now in 3 locations around the world, the company’s primary base is budget KVM and OpenVZ VPS services. Inception Hosting is also now an official CloudFlare partner, so it is able to offer excellent-value cPanel-based shared hosting in London, with plans to expand into the Netherlands and the USA in the near future.”

Here are the offers: 

BASIC

  • 1 Domain
  • 1GB SSD Cached RAID-10 Storage
  • 100GB Data Transfer
  • 1 MySQL/MariaDB Database
  • 5 Email Accounts
  • Litespeed/CloudLinux
  • JetBackup/Cloudflare Railgun
  • Softaculous/Attracta SEO Tools
  • SpamScan/DDoS Protection
  • PHP 7/MariaDB
  • Daily Offsite Backups
  • Free SSL
  • €1.00/year
  • [ORDER]

GOLD

  • 10 Addon Domains
  • 2GB SSD Cached RAID-10 Storage
  • Unmetered Data Transfer
  • Unlimited MySQL/MariaDB Databases
  • Unlimited Email Accounts
  • Litespeed/CloudLinux
  • JetBackup/Cloudflare Railgun
  • Softaculous/Attracta SEO Tools
  • SpamScan/DDoS Protection
  • PHP 7/MariaDB
  • Daily Offsite Backups
  • Free SSL
  • €5.00/year
  • [ORDER]

PLATINUM

  • 20 Addon Domains
  • 5GB SSD Cached RAID-10 Storage
  • Unmetered Data Transfer
  • Unlimited MySQL/MariaDB Databases
  • Unlimited Email Accounts
  • Litespeed/CloudLinux
  • JetBackup/Cloudflare Railgun
  • Softaculous/Attracta SEO Tools
  • SpamScan/DDoS Protection
  • PHP 7/MariaDB
  • Daily Offsite Backups
  • Free SSL
  • €7.00/year
  • [ORDER]

NETWORK INFO:

Datacenter: Clouvider – London
Looking glass: http://lg.clouvider.net

Please let us know if you have any questions/comments and enjoy!


Why does the BFR (Bradley, Fayyad and Reina) algorithm assume clusters to be normally distributed around their centroids?

I’m following a course on data mining based on the lectures from Stanford University and the book Mining of Massive Datasets.

On the topic of clustering, the BFR algorithm is explained with this video.
I understand how the algorithm works, but I am unclear on why it makes the strong assumption that each cluster is normally distributed around a centroid in Euclidean space.

The video explains that the assumption implies that clusters look like axis-aligned ellipses, which is understandable as the dimensions must be independent.
I’ve watched the video a few times, and read the section in the book (freely downloadable using the first link) on pages 257-259, but I’m unable to grasp why that assumption is made, and why it has to be made.
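
For reference, the per-cluster summary the book describes: for a cluster of $N$ points, BFR keeps, for each dimension $i$, the sum $\mathrm{SUM}_i$ and the sum of squares $\mathrm{SUMSQ}_i$ of the coordinates, which give

$$c_i = \frac{\mathrm{SUM}_i}{N}, \qquad \sigma_i^2 = \frac{\mathrm{SUMSQ}_i}{N} - \left(\frac{\mathrm{SUM}_i}{N}\right)^2,$$

and a new point $x$ is assigned to a cluster based on the distance

$$d(x, c) = \sqrt{\sum_i \left(\frac{x_i - c_i}{\sigma_i}\right)^2},$$

so the only shape information ever retained per cluster is a per-dimension mean and variance, which are exactly the parameters of an axis-aligned normal distribution.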

Could someone explain this for me?

Missing Icons After Playing around in Ubuntu 18.04 LTS

I have been playing around with themes and icons in Ubuntu for a while now. Due to low disk space, I decided to remove unwanted themes and icons: /usr/share/icons$ sudo rm -rf Adwaita etc. I use Numix Circle for icons, the Canta Dark theme, and Flat-Remix-Darkest-fullPanel for the Shell theme. I control all of these using the GNOME Tweaks tool.

After all this, everything is fine, except that the Ubuntu logo is missing in Settings > About. Also, many of the default icons are missing when I use any other icon theme provided by Ubuntu, even after reinstalling the ones I had removed. What am I missing here? I did not reinstall some of the themes. Do you think those themes would have contained these icons?


C++ wrapper around uniform mt19937 SequenceContainer

With the following interface in mind

EasyRandom<unsigned int> prng(a, b);
auto x = prng();   // scalar
auto v = prng(10); // vector

I wrote the following class:

// https://en.cppreference.com/w/cpp/numeric/random/uniform_int_distribution
template <typename T = unsigned int>
class EasyRandom {
private:
  std::random_device rd;
  std::unique_ptr<std::mt19937> gen;
  std::unique_ptr<std::uniform_int_distribution<T>> dist;

public:
  EasyRandom(T a, T b)
  {
    gen = std::make_unique<std::mt19937>(rd());
    dist = std::make_unique<std::uniform_int_distribution<T>>(a, b);
  }

  T operator()() { return (*dist)(*gen); }

  std::vector<T> operator()(size_t n)
  {
    std::vector<T> v;
    for (; n > 0; v.push_back(operator()()), --n);
    return v;
  }
};

I also have a few specific questions:

  1. Is there a way to instantiate EasyRandom without the use of pointers?
  2. Is it possible to change operator()(size_t n) to return a user-specified SequenceContainer (e.g. list, deque) instead of hard-coding it to std::vector?

I have imported around 90 million unique URLs as new targets and they get deleted with these settings

I crawled quite a few URLs myself and deduplicated them.

I imported them into a fresh project.

After a while, the project's target URLs get deleted.

Here are my settings:

Is there a maximum target URL file size that GSA SER supports?

The imported target URL file is 6.13 GB.

The import itself succeeds, because I can see the target URLs file in the project's folder.

GSA SER starts by fetching around 16k URLs.

And today, when I woke up after about 10 hours, I saw that the target URLs in the project folder had been reset, with only several megabytes of target URLs left.

I am trying again. The fresh project starts with:

Loaded 16777 URLs from imported sites

And damn, the target URLs already got reset again.

Keyboard shortcut to transpose characters around cursor

In Emacs (and macOS), typing Control+T will transpose the characters around the cursor in any input field. For example, if I type

helol<c-t>

hello

results. Is there a way to do this in Ubuntu? I'm imagining a solution using something like sxhkd that involves copying the previous two characters, deleting them, and pasting a transposed version, but I imagine this would look distracting and take a little while to execute.

I already have Emacs keys enabled system wide but apparently this is not one of the shortcuts it offers.
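
In the spirit of what the question imagines, here is an untested sketch of such a script (the reliance on xdotool and xclip, the timing constants, and the idea of binding it to a key via sxhkd are all assumptions, not an existing Ubuntu feature):

#!/usr/bin/env python3
# Hypothetical transpose-chars helper: swaps the two characters
# immediately before the cursor, meant to be bound to a hotkey
# (e.g. via sxhkd). Requires the X11 tools xdotool and xclip.
import subprocess
import time

def run(*cmd):
    return subprocess.run(cmd, capture_output=True, text=True)

# Select the two characters to the left of the cursor.
run("xdotool", "key", "--clearmodifiers", "shift+Left", "shift+Left")
time.sleep(0.05)

# Copy the selection and read it back from the clipboard.
run("xdotool", "key", "--clearmodifiers", "ctrl+c")
time.sleep(0.05)
pair = run("xclip", "-o", "-selection", "clipboard").stdout

if len(pair) == 2:
    # Retype the pair reversed; typing replaces the active selection.
    run("xdotool", "type", "--clearmodifiers", pair[::-1])
else:
    # Selection was not two plain characters; just restore the cursor.
    run("xdotool", "key", "--clearmodifiers", "Right")

As the question anticipates, this visibly selects and retypes the characters, so it is slower and more distracting than a native Emacs-style binding.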

Whiten black contours around a skewed image

I have this image:

[image]

I want to whiten the black contours (borders) around it without affecting the image content. Here is the code I used:

import cv2
import numpy as np
import shapely.geometry as shageo

img = cv2.imread('filename.jpg')

# get the gray image and binarize it
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)  # imread returns BGR, not RGB
gray[gray < 20] = 0
gray[gray > 0] = 255

# get the largest boundary of the binary image to locate the target
contours, _ = cv2.findContours(gray, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
rect = cv2.minAreaRect(contours[0])
box = cv2.boxPoints(rect)
box = box.astype(int)  # np.int0 was removed in recent NumPy

poly = shageo.Polygon(box)
h, w = img.shape[:2]
ind = np.zeros((h, w), bool)  # np.bool is deprecated; use the builtin

# check whether each point lies inside the target or not
for i in range(h):
    for j in range(w):
        p = shageo.Point(j, i)
        if not p.within(poly):
            ind[i, j] = True

# whiten the outside points
img[ind] = (255, 255, 255)
cv2.imwrite('result.jpg', img)

Here is the result: [image]

As you can see, the code works, but it's very slow because of the nested for loops.

Any suggestions for avoiding the for loops or making them faster?
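
Not an answer from the thread, but one common way to vectorise this: let OpenCV rasterise the rotated rectangle into a mask and index the image with it, so the point-in-polygon test runs in C rather than a Python double loop (same file name and threshold assumed as in the code above):

import cv2
import numpy as np

img = cv2.imread('filename.jpg')

# same binarization as before
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
gray[gray < 20] = 0
gray[gray > 0] = 255

contours, _ = cv2.findContours(gray, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
rect = cv2.minAreaRect(contours[0])
box = cv2.boxPoints(rect).astype(np.int32)

# rasterise the box into a mask once, instead of testing every pixel
mask = np.zeros(img.shape[:2], np.uint8)
cv2.fillConvexPoly(mask, box, 255)

# a single vectorised assignment whitens everything outside the box
img[mask == 0] = (255, 255, 255)
cv2.imwrite('result.jpg', img)

This also drops the shapely dependency entirely.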

Can BPP be bounded around any constant other than 1/2?

A language $L$ is in BPP if there exists a randomised TM that outputs a correct answer with probability at least $1/2 + 1/p(n)$ for some polynomial $p(n)$, where $n$ is the length of the input. This probability can be amplified to $1 - 2^{-q(n)}$, for some polynomial $q(n)$, by repeating the algorithm polynomially many times and taking the majority.

I was wondering whether it is necessary to have this bound around the constant $1/2$. Can we have a randomised algorithm that answers correctly with probability $c + 1/p(n)$ for some $c < 1/2$ and still amplify the probability in polynomial time?

The proof for the case of $1/2 + 1/p(n)$ uses the Chernoff bound on the lower tail, which requires $0 < \delta < 1$; in that case $\delta = 1 - \frac{1}{2p}$, which means $p$ must be greater than $1/2$. Proof here.
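
For reference, the multiplicative lower-tail Chernoff bound used there: if $X$ counts the correct answers among $t$ independent runs, each correct with probability $p$ (here $p$ is the per-trial success probability, not the polynomial $p(n)$), and $\mu = \mathbb{E}[X] = tp$, then

$$\Pr[X \le (1-\delta)\mu] \le e^{-\delta^2 \mu / 2}, \qquad 0 < \delta < 1.$$

A majority vote fails when $X \le t/2$, i.e. $(1-\delta)\mu = t/2$, so $\delta = 1 - \frac{t/2}{tp} = 1 - \frac{1}{2p}$, which lies in $(0,1)$ only when $p > 1/2$.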

However, here is a proof that weak BPP = strong BPP, where strong BPP is BPP as we know it, and weak BPP is the variant in which the probability of a correct answer is $s(n) + 1/p(n)$, where $p(n)$ is any polynomial and $s(n)$ is any polynomial-time computable function.