## Speeding up the Rummikub algorithm – explanation required

Regarding this question: Rummikub algorithm.

I was reading the first part of the solution in the posted answer (specifically, the case where there are no jokers involved, all tiles are distinct and only four colours are involved). Then I reached the part in which the answer says that the algorithm runs in $$O(ABCD(A+B+C+D))$$ time, and it is easy to see why.

However, he then goes on to say that we can speed up the algorithm so that it runs in $$O(ABCD)$$ time by changing "the recurrence to ensure this occurs only once while maintaining correctness, which leads to $$O(1)$$ time for every ‘cell’ in the DP-table".

My problem is: I do not see how this can be done. I have tried playing around a bit with the recurrence, but I do not see how it can be modified, or what else we should keep track of, in order to achieve the speed-up.

## Speeding up NIntegrate and DensityPlot

I’m making a density plot of a spectral function `Sfull[kx, ky, kz, \[CapitalOmega]]` along a specified path in `{kx,ky,kz}`-space. The plot looks as it should, but it takes a long time to run (12+ hours). `Sfull` contains numerical integrals which I believe are the source of the problem. The evaluation of `Sfull` along the `{kx,ky,kz}`-space path could perhaps be made more efficient as well.

---

Definition of `Sfull[kx_, ky_, kz_, \[CapitalOmega]_]`:

```mathematica
\[Theta] = 1.119; Qx = 0; Qy = 0; Qz = 0; \[Alpha] = 0.2; S = 0.5; J = 1.2;

\[Gamma][kx_, ky_, kz_] := (Cos[kx] + Cos[ky] + \[Alpha] Cos[kz])/(2 + \[Alpha]);

\[Omega]k[kx_, ky_, kz_] := Sqrt[(1 + \[Gamma][kx, ky, kz]) (1 - Cos[2 \[Theta]] \[Gamma][kx, ky, kz])];

\[Epsilon][kx_, ky_, kz_] := 2 J S (2 + \[Alpha]) \[Omega]k[kx, ky, kz];

Ak[kx_, ky_, kz_] := 2 J S (2 + \[Alpha]) (1 + Sin[\[Theta]]^2 \[Gamma][kx, ky, kz]);

u[kx_, ky_, kz_] := Sqrt[(Ak[kx, ky, kz] + \[Epsilon][kx, ky, kz])/(2 \[Epsilon][kx, ky, kz])];

v[kx_, ky_, kz_] := Sqrt[(Ak[kx, ky, kz] - \[Epsilon][kx, ky, kz])/(2 \[Epsilon][kx, ky, kz])];

\[CapitalPhi]1[kx_, ky_, kz_, qx_, qy_, qz_] :=
  \[Gamma][kx, ky, kz] (u[kx, ky, kz] + v[kx, ky, kz]) (u[qx, qy, qz] v[kx - qx + Qx, ky - qy + Qy, kz - qz + Qz] + v[qx, qy, qz] u[kx - qx + Qx, ky - qy + Qy, kz - qz + Qz]) +
    \[Gamma][qx, qy, qz] (u[qx, qy, qz] + v[qx, qy, qz]) (u[kx, ky, kz] u[kx - qx + Qx, ky - qy + Qy, kz - qz + Qz] + v[kx, ky, kz] v[kx - qx + Qx, ky - qy + Qy, kz - qz + Qz]) +
    \[Gamma][kx - qx + Qx, ky - qy + Qy, kz - qz + Qz] (u[kx - qx + Qx, ky - qy + Qy, kz - qz + Qz] + v[kx - qx + Qx, ky - qy + Qy, kz - qz + Qz]) (u[kx, ky, kz] u[qx, qy, qz] + v[kx, ky, kz] v[qx, qy, qz]);

\[CapitalPhi]2[kx_, ky_, kz_, qx_, qy_, qz_] :=
  \[Gamma][kx, ky, kz] (u[kx, ky, kz] + v[kx, ky, kz]) (u[qx, qy, qz] v[kx - qx + Qx, ky - qy + Qy, kz - qz + Qz] + v[qx, qy, qz] u[kx - qx + Qx, ky - qy + Qy, kz - qz + Qz]) +
    \[Gamma][qx, qy, qz] (u[qx, qy, qz] + v[qx, qy, qz]) (u[kx, ky, kz] v[kx - qx + Qx, ky - qy + Qy, kz - qz + Qz] + v[kx, ky, kz] u[kx - qx + Qx, ky - qy + Qy, kz - qz + Qz]) +
    \[Gamma][kx - qx + Qx, ky - qy + Qy, kz - qz + Qz] (u[kx - qx + Qx, ky - qy + Qy, kz - qz + Qz] + v[kx - qx + Qx, ky - qy + Qy, kz - qz + Qz]) (u[kx, ky, kz] v[qx, qy, qz] + v[kx, ky, kz] u[qx, qy, qz]);

\[CapitalSigma][kx_, ky_, kz_, \[CapitalOmega]_] := (1/2) NIntegrate[
    -Abs[\[CapitalPhi]1[kx, ky, kz, qx, qy, qz]]^2/(\[CapitalOmega] - \[Epsilon][qx, qy, qz] - \[Epsilon][kx - qx + Qx, ky - qy + Qy, kz - qz + Qz] + I 0.001) +
      Abs[\[CapitalPhi]2[kx, ky, kz, qx, qy, qz]]^2/(\[CapitalOmega] + \[Epsilon][qx, qy, qz] + \[Epsilon][kx + qx - Qx, ky + qy - Qy, kz + qz - Qz] - I 0.001),
    {qx, 0, 2 \[Pi]}, {qy, 0, 2 \[Pi]}, {qz, 0, 2 \[Pi]},
    Method -> "AdaptiveQuasiMonteCarlo", AccuracyGoal -> 4, MaxRecursion -> 10^4];

G[kx_, ky_, kz_, \[CapitalOmega]_] := 1/(\[CapitalOmega] - \[Epsilon][kx, ky, kz] + \[CapitalSigma][kx, ky, kz, \[CapitalOmega]]);

AA[kx_, ky_, kz_, \[CapitalOmega]_] := (-1/\[Pi]) Im[G[kx, ky, kz, \[CapitalOmega]]];

Sxx[kx_, ky_, kz_, \[CapitalOmega]_] := \[Pi] S (u[kx, ky, kz] + v[kx, ky, kz])^2 AA[kx, ky, kz, \[CapitalOmega]];

Syy[kx_, ky_, kz_, \[CapitalOmega]_] := \[Pi] S (u[kx, ky, kz] - v[kx, ky, kz])^2 AA[kx, ky, kz, \[CapitalOmega]];

Szz[kx_, ky_, kz_, \[CapitalOmega]_] := \[Pi] NIntegrate[
    (u[qx, qy, qz] v[kx - qx, ky - qy, kz - qz] + v[qx, qy, qz] u[kx - qx, ky - qy, kz - qz])^2 (1/Sqrt[2 \[Pi]] (0.1 J)) Exp[-(\[CapitalOmega] - \[Epsilon][qx, qy, qz] - \[Epsilon][kx - qx, ky - qy, qz - kz])^2/(2 (0.1 J)^2)],
    {qx, 0, 2 \[Pi]}, {qy, 0, 2 \[Pi]}, {qz, 0, 2 \[Pi]},
    Method -> "AdaptiveQuasiMonteCarlo", AccuracyGoal -> 4, MaxRecursion -> 10^4];

Sxx0[kx_, ky_, kz_, \[CapitalOmega]_] := Sin[\[Theta]]^2 Sxx[kx, ky, kz, \[CapitalOmega]] + Cos[\[Theta]]^2 Szz[kx - Qx, ky - Qy, kz - Qz, \[CapitalOmega]];

Szz0[kx_, ky_, kz_, \[CapitalOmega]_] := Cos[\[Theta]]^2 Sxx[kx - Qx, ky - Qy, kz - Qz, \[CapitalOmega]] + Sin[\[Theta]]^2 Szz[kx, ky, kz, \[CapitalOmega]];

Syy0[kx_, ky_, kz_, \[CapitalOmega]_] := Syy[kx, ky, kz, \[CapitalOmega]];

Sfull[kx_, ky_, kz_, \[CapitalOmega]_] := Sxx0[kx, ky, kz, \[CapitalOmega]] + Syy0[kx, ky, kz, \[CapitalOmega]] + Szz0[kx, ky, kz, \[CapitalOmega]];
```

Definition of `{kx,ky,kz}`-space path:

```mathematica
kpath = {{0, {0, 0, 0, \[CapitalOmega]}}, {1, {\[Pi], \[Pi], 0, \[CapitalOmega]}},
   {2, {\[Pi], 0, 0, \[CapitalOmega]}}, {3, {0, \[Pi], 0, \[CapitalOmega]}},
   {4, {0, 0, 0, \[CapitalOmega]}}};
if = Interpolation[kpath, InterpolationOrder -> 1];
```

Density plot:

```mathematica
DensityPlot[{Sfull @@ if[i]}, {i, 0, 4}, {\[CapitalOmega], 0, 10},
 ColorFunction -> "SunsetColors", PlotLegends -> Automatic]
```

This produces the plot, but it takes over 12 hours to run. `\[CapitalSigma][kx, ky, kz, \[CapitalOmega]]` and `Szz[kx, ky, kz, \[CapitalOmega]]` contain the numerical integrals (which I would like to perform with the adaptive quasi Monte Carlo scheme with `AccuracyGoal -> 4, MaxRecursion -> 10^4`), and these are what I’d like to speed up. I’m also unsure that using `Sfull@@if[i]` is the most efficient way to produce the density plot. I would eventually like to make the plot smoother by increasing `PlotPoints`, but this would of course increase the runtime further.

I have tried `SymbolicPreprocessing->0` and defining functions with `?NumericQ` variables, but this does not speed up the integrals noticeably.
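A common workaround, sketched in Python for illustration (the `expensive` function is a toy stand-in for `Sfull`): evaluate the expensive quantity once on a fixed grid, then let the plot sample a cheap lookup/interpolant instead of re-triggering the integrals at every sample point. In Mathematica the analogous move is to build a table of `Sfull` values and feed it to `ListDensityPlot`, or to memoize with the `f[x_] := f[x] = …` idiom.

```python
import math

def expensive(x, w):
    # stand-in for Sfull: pretend every call triggers a slow integral
    return math.sin(x) * math.exp(-w)

# Tabulate once on a coarse grid covering the plot range.
xs = [i * 0.5 for i in range(9)]   # x in [0, 4]
ws = [j * 0.5 for j in range(9)]   # w in [0, 4]
table = {(x, w): expensive(x, w) for x in xs for w in ws}

def interp(x, w):
    # Nearest-grid-point lookup; real code would interpolate linearly.
    xn = min(xs, key=lambda g: abs(g - x))
    wn = min(ws, key=lambda g: abs(g - w))
    return table[(xn, wn)]
```

The plot routine then only ever calls the cheap `interp`, so the total number of slow evaluations is fixed by the grid size rather than by however many samples the plotter decides to take.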

## Speeding up thousands of string parses

I have a mapping application which takes string arguments in the form of string arrays. I parse these and then perform an action on the map. One of these is `MoveVertex` which lets the user pass in the name of an existing shape/feature on the map, the index of the vertex to move, and a new x,y location to move that vertex to.

I’m currently getting thousands of these in a row, all for the same shape. It takes anywhere from 3-10ish seconds to parse them all and complete the action. I’m looking for any improvements I could make.

One thing I did so far was stop the map from refreshing every time a vertex is moved. This helped a bit, but it’s still slow.

Example of the string[] commands being sent:

```
command[0] = "move"          // command type
command[1] = "vertex"        // what to move
command[2] = "testLine[10]"  // move the vertex at index 10 in testLine's list of vertices
command[3] = "x"             // new x value for the vertex
command[4] = "y"             // new y value for the vertex
```

Here’s the function that does the work:

```csharp
public static void MoveVertex(string[] command, Layer inMemoryFeatureLayer, Map map)
{
    int index; //this will be the index of the vertex in the collection of vertices of the shape
    string featureName; //name of the shape/feature that has the vertex to move

    if (command.Length < 5) //need at least 5 parameters
    {
        return;
    }

    try
    {
        if (!command[2].Contains('[')) //if it doesn't contain brackets which contain an index of the vertices
        {
            return;
        }

        const string pattern = @"\[(.*?)\]";
        var query = command[2];
        var matches = Regex.Matches(query, pattern); //gets anything inside the brackets
        index = Convert.ToInt32(matches[0].Groups[1].Value); //should be an int

        featureName = command[2].Substring(0, command[2].IndexOf('[')).ToUpper(); //everything before the bracket is the name of the object
    }
    catch (Exception ex)
    {
        return;
    }

    if (!double.TryParse(command[3], out double longitude) || !double.TryParse(command[4], out double latitude))
    {
        return;
    }

    try
    {
        BaseShape shape;

        inMemoryFeatureLayer.Open(); //make sure the layer is open
        if (inMemoryFeatureLayer.Name.Contains(GlobalVars.LineType)) //if it's the layer holding line shapes
        {
            shape = (LineShape)inMemoryFeatureLayer.FeatureSource.GetFeatureById(featureName, ReturningColumnsType.NoColumns).GetShape();
            ((LineShape)shape).Vertices[index] = new Vertex(longitude, latitude); //set the vertex to a new location
        }
        else //it's the layer holding polygon shapes
        {
            shape = (PolygonShape)inMemoryFeatureLayer.FeatureSource.GetFeatureById(featureName, ReturningColumnsType.NoColumns).GetShape();
            ((PolygonShape)shape).OuterRing.Vertices[index] = new Vertex(longitude, latitude); //set the vertex to a new location
        }

        inMemoryFeatureLayer.EditTools.BeginTransaction();
        inMemoryFeatureLayer.EditTools.Update(shape);
        inMemoryFeatureLayer.EditTools.CommitTransaction();
        map.Refresh(); //this won't happen because I suspend refreshes beforehand
    }
    catch (Exception ex)
    {
        //log it
    }
}
```
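For the parsing half, the bracket match and index extraction can be done with a single precompiled pattern and no exception-driven control flow. A sketch of the idea in Python (function and pattern names are illustrative, not from the original code):

```python
import re

# Compile once at module load; for thousands of identically shaped
# commands this avoids re-compiling the pattern on every call.
TARGET = re.compile(r"^(?P<name>[^\[]+)\[(?P<index>\d+)\]$")

def parse_vertex_command(command):
    """Return (feature_name, index, x, y), or None for a malformed command."""
    if len(command) < 5 or command[0] != "move" or command[1] != "vertex":
        return None
    m = TARGET.match(command[2])
    if m is None:
        return None  # no "name[index]" shape -> reject without raising
    try:
        x, y = float(command[3]), float(command[4])
    except ValueError:
        return None
    return m.group("name").upper(), int(m.group("index")), x, y
```

On the C# side, the analogous moves would be a `static readonly Regex` created with `RegexOptions.Compiled`, `int.TryParse` instead of `Convert.ToInt32` inside a try/catch, and, since all the commands target the same shape, committing the whole batch of vertex updates inside one `BeginTransaction`/`CommitTransaction` pair rather than one transaction per command.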

## CentOS 7 LAMP Server Tutorial Part 5: Speeding up WordPress with Redis

In the previous CentOS 7 LAMP Server Tutorials, we configured a LAMP stack, secured it with Let’s Encrypt SSL certificates, and installed WordPress with WP-CLI. Here in Part 5 we’re going to up our game just a bit more by installing server side caching with a program called Redis. Then we’ll configure WordPress to use Redis and check our performance to see if Redis actually made an impact. Let’s get started!

# What is Redis?

According to redis.io, Redis is a “data structure store” which is used as an in memory database. Because it stores information (called “objects”) in memory, it saves the server the trouble of having to look in its database. In this way it serves as a cache, and as such can help the server handle more traffic without having to work harder to do it.

By default, WordPress doesn’t know how to use Redis. We’re going to enable WordPress to use Redis by installing a WordPress caching plugin called “W3 Total Cache”. First, we need to install Redis and get it started.

# Installing Redis

Redis is available from the CentOS “Extra Packages for Enterprise Linux” (EPEL) software repository. If you’ve followed this tutorial series from the beginning, then the CentOS EPEL repository is already enabled. If not, then you’ll need to enable it now using the following command:

`yum -y install epel-release`

Now you’re ready to install Redis. It’s easy: just a quick yum install. Run the following command:

`yum -y install redis`


With Redis now installed, we need to start Redis for the first time and configure CentOS to launch Redis at startup. Run the following commands:

```
systemctl start redis
systemctl enable redis
```


Lastly, we need to make sure that Redis is actually started and that it is reachable. Much like PHP-FPM’s default configuration, Redis listens on the “localhost” IP (127.0.0.1) on a specific port (6379). Other programs can communicate with Redis by making a connection to that IP and port.

We’re going to test that communication now using the redis-cli program. Just like you can ping an IP with the “ping” command, you can also ping Redis with the “redis-cli ping” command. If Redis is running properly, it’ll simply reply with “PONG”. Give it a try:

`redis-cli ping`

Ours was running properly.

If you got “PONG”, then great! Redis is running and you’re ready to configure WordPress to take advantage of it. Almost. First, we need to get PHP up to speed.

# Enabling Redis in PHP 7.3

Now that we know that Redis is working, we need to give PHP the ability to talk to Redis. Thankfully, this is as simple as installing the PHP 7.3 Redis extension and restarting PHP-FPM. If you’ve followed our tutorial, then the following commands will do the job:

```
yum -y install php73-php-redis
systemctl restart php73-php-fpm
```

# Configuring WordPress to use Redis

As previously mentioned, WordPress doesn’t have the know-how to use Redis. It also doesn’t have any built-in caching. We will solve both of those problems by installing the W3 Total Cache plugin for WordPress. Log in to your WordPress Dashboard, and go to Plugins > Add New. Search for “W3 Total Cache” and click on “Install Now”.

Click on Activate to enable W3 Total Cache. You’ll see a new menu item in the WordPress Dashboard: “Performance”. Click on Performance > General to get into the W3 Total Cache settings.

Go to each cache setting and enable it, choosing “Redis” from the drop-down where it is available. If “Redis” is greyed out or missing from the drop-downs, then it’s likely that Redis isn’t installed, isn’t running, or isn’t enabled in PHP. You’ll need to retrace the previous steps before continuing.

Once all of the caches are enabled and Redis is selected, click on “Save All Settings”.

Check that your site functions correctly. If it doesn’t look right, you may need to change the Minification settings, as those can often cause problems with CSS and JavaScript. It’ll take some trial and error. Once your site is working correctly, we can continue.

Configuring WordPress for Redis really is that easy! But it doesn’t tell us whether it’s doing us any good. We want to know: Is Redis doing its job, and making an impact on speed? Let’s find out!

# Testing Redis

To know whether Redis is making a positive impact on your site, we’ll need to do some load testing. For that we used LoadImpact.com. Their free account will allow you to use up to 50 Virtual Users for 12 minutes at a time to load test your site. We configured a test for 50 Virtual Users (VU) for 3 minutes using the “Stress” ramp-up setting.

Both tests were run with W3 Total Cache installed. The “Disk Caching” test was done with W3 Total Cache configured for “Disk” or “Disk: Enhanced”. The “Redis Caching” test was done with Redis configured as previously described.

With Disk caching enabled, our WordPress site served 27 requests per second with an average response time of 22 ms. For a single-core VPS with 512 MB of RAM, this is fine.

When we reconfigured the site to use Redis, there was a very nice improvement. The server was able to serve 35 requests per second, with a slightly higher average response time of 34 ms. The higher response time is likely due to the server being busy handling so many more requests. Still, 34 ms is quite good and completely acceptable.

On the other hand, 35 requests per second vs 27 is VERY good. It’s a 30% increase in performance, and all we had to do was install and configure Redis!

While we’ve covered the general installation and utilization of Redis, we’ve only really scratched the surface. Redis can be configured in countless different ways. Tuning for higher performance is possible, and fine tuning it for your specific requirements may yield even better results. You can learn more about Redis in the official Redis documentation: https://redis.io/documentation

# Are we done?

Not quite! There’s still more performance to be had. In the next installment of this series, we’re going to make even more use of caching by replacing Apache with a caching web server called Nginx (pronounced “Engine-X”). Check back soon!

The post CentOS 7 LAMP Server Tutorial Part 5: Speeding up WordPress with Redis appeared first on Low End Box.

## Way for speeding up apache/php

I’m struggling to find a way to speed up my Apache setup. I’m using the newest Debian 9 release; my VPS has 4 powerful cores, 8 GB RAM and SSD storage. It’s a very reputable provider and everything works like a charm. Except WooCommerce.

My setup: VestaCP with Redis (for caching purposes), nginx + Apache 2.4.25 (as backend, prefork MPM) and PHP 7.3. OPcache is configured too. The DB is optimized via mysqltuner tips (and DB queries are working like a charm; WooCommerce product search is blazing fast). Swappiness is set to 0.

I’m hosting a few WordPress sites and 2 WooCommerce stores (a few addons, semi-advanced themes and… WPML, which is probably the true villain here, because if I turn it off, everything works fine… but we need to provide 2 languages). Plain WordPress sites open and work pretty fast (almost instantly), but the WooCommerce stores… well, not that fast.

Of course, someone could say that 2 sec load time is okay, but I want to make it as snappy as possible.

The thing is, the CPU is almost idling while a page loads. None of my 4 cores even goes above 40%, no matter what I’m doing. RAM is never more than 1/4 full (even with Redis turned on and the DB tuned to use more than 4 GB). As I said, WooCommerce search is blazing fast: I get results in less than 1 second, and we have over 2000 items in our store, so I’m almost certain it’s not a DB thing.

I will be thankful for any advice.

## Speeding Up Excel Distance Calculation Using Bing API Calls

I am writing VB code in Excel to calculate the distance between an employee’s home address and work address using Bing Maps API calls. The process follows this general flow:

1) Convert the employee’s address to Lat-Long values using the GetLatLong function

2) Convert the employee’s work address to Lat-Long values using the GetLatLong function

3) Calculate the distance between these two points using the GetDistance function

4) Calculate the drive time between these two points using the GetTime function

The process is working, but it is excruciatingly slow. The employee population is approximately 2300, and it takes almost an hour to execute.

I am not a coder, but I can functionally modify found code to my purposes. This is an amalgamation of a couple of different processes found through Google searching. The code pieces in use are:

```vb
Public Function GetDistance(start As String, dest As String)
    Dim firstVal As String, secondVal As String, lastVal As String
    firstVal = "https://dev.virtualearth.net/REST/v1/Routes/DistanceMatrix?origins="
    secondVal = "&destinations="
    lastVal = "&travelMode=driving&o=xml&key=<My Key>&distanceUnit=mi"
    Set objHTTP = CreateObject("MSXML2.ServerXMLHTTP")
    Url = firstVal & start & secondVal & dest & lastVal
    objHTTP.Open "GET", Url, False
    objHTTP.setRequestHeader "User-Agent", "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.0)"
    objHTTP.send ("")
    GetDistance = Round(WorksheetFunction.FilterXML(objHTTP.responseText, "//TravelDistance"), 0) & " miles"
End Function

Public Function GetTime(start As String, dest As String)
    Dim firstVal As String, secondVal As String, lastVal As String
    firstVal = "https://dev.virtualearth.net/REST/v1/Routes/DistanceMatrix?origins="
    secondVal = "&destinations="
    lastVal = "&travelMode=driving&o=xml&key=<My Key>&distanceUnit=mi"
    Set objHTTP = CreateObject("MSXML2.ServerXMLHTTP")
    Url = firstVal & start & secondVal & dest & lastVal
    objHTTP.Open "GET", Url, False
    objHTTP.setRequestHeader "User-Agent", "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.0)"
    objHTTP.send ("")
    GetTime = Round(WorksheetFunction.FilterXML(objHTTP.responseText, "//TravelDuration"), 0) & " minutes"
End Function

Public Function GetLatLong(address As String, city As String, state As String, zip As String)
    Dim firstVal As String, secondVal As String, thirdVal As String, fourthVal As String, lastVal As String
    firstVal = "https://dev.virtualearth.net/REST/v1/Locations?countryRegion=United States of America&adminDistrict="
    secondVal = "&locality="
    thirdVal = "&postalCode="
    fourthVal = "&addressLine="
    lastVal = "&maxResults=1&o=xml&key=<My Key>"
    Url = firstVal & state & secondVal & city & thirdVal & zip & fourthVal & address & lastVal
    Set objHTTP = CreateObject("MSXML2.ServerXMLHTTP")
    objHTTP.Open "GET", Url, False
    objHTTP.setRequestHeader "User-Agent", "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.0)"
    objHTTP.send ("")
    GetLatLong = WorksheetFunction.FilterXML(objHTTP.responseText, "//Point//Latitude") & "," & WorksheetFunction.FilterXML(objHTTP.responseText, "//Point//Longitude")
End Function
```

To be clear, the process works well, just extremely slowly. Any thoughts on speeding this up?

Thanks, Lee

## Speeding violations in France by UK registered car

Ok, so I was in France the previous weekend and was running late to make it to the Eurotunnel back. So I stepped on the accelerator and drove above the speed limit. Waze didn't show me any speed cameras, so I didn't bother to check the road signs.

Now I believe I have gone past a number of speed cameras while above the limit.

I am worried about what to do if I receive fines at home.

What will happen if I don't pay them?

Not proud of speeding, but I did commit the stupid act. 🙁

## Never Received German or Swiss speeding tickets

A little background information: in June 2017 I was “blitzed” twice in Germany and once in Switzerland while visiting friends and family. Sure enough, I received three emails and three letters in the mail from Enterprise Germany. I have waited months without receiving any citations from the local authorities where I was speeding. Worried, I emailed the Enterprise Ordnungswidrigkeiten department, stating my concern along with the Ordnungswidrigkeiten numbers that I had received from them. I never heard back. I also called the Enterprise support number here in the States with all the information and my concerns, and they said they would reach out to their European headquarters. It is now 2019 and, despite multiple attempts to get this resolved, I have still never heard back. I am planning on going back and renting a car this summer. I don’t know what else to do, or what consequences I will face when I enter Germany. By the way, I am a dual national (German/American), in case there are any additional consequences when I enter the country with my German passport.

## Speeding up website for better rankings

Hi all,

Not sure if this is in fact the right section. I only have basic, self-taught knowledge of working with websites, and I would consider myself an extreme beginner.

That being said, I have been noticing my web traffic going down over the past few months, and I have now dropped from 1000 page views a day to 200-400. I have been reading around, and some things point to Google ranking sites based on mobile optimization and speedy loading times.