Improving performance of moving arrays into an array

I am loading a bunch of arrays and putting them into a single array. Loading the data is negligible, but copying it into combinedArray takes significant time (as in this code).

import time
import numpy as np

start_time = time.time()

st, en = 0, 0
combinedArray = np.empty(shape=(30000 * 30000))

# Get the array that each process wrote to
for i in range(10):
    # load the array
    # ex loading
    X_np = np.empty(shape=(30000 * 30000) // 10)
    st = en
    en += np.shape(X_np[:])[0]
    # add to the array list
    combinedArray[st:en] = X_np[:]

print("Array Calc:", time.time() - start_time)

Most of what I have found is people advising against append, so I tried creating the array first, but even just copying the chunks into it is time consuming. Any advice on how to optimize this is appreciated.
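For context, here is a minimal sketch of one alternative I have been wondering about (the file name combined.dat, the element counts, and the producer function are made up for illustration, and it assumes the ten producer processes can be pointed at a shared memory-mapped file): each process writes its chunk directly into its own slice of the final array, so the separate copy into combinedArray disappears.

# Minimal sketch (assumed setup, not my actual loaders): a shared
# memory-mapped file that every producer fills in place.
import numpy as np

N = 1_000_000        # stand-in for 30000 * 30000
CHUNK = N // 10      # elements written by each of the 10 producers

# Parent: create the backing file once.
combined = np.memmap("combined.dat", dtype=np.float64, mode="w+", shape=(N,))

def producer(i: int) -> None:
    # Each producer re-opens the same file and fills only its own slice.
    out = np.memmap("combined.dat", dtype=np.float64, mode="r+", shape=(N,))
    out[i * CHUNK:(i + 1) * CHUNK] = i      # placeholder for the real data
    out.flush()

for i in range(10):
    producer(i)      # in the real code these would run in separate processes

print(combined[::CHUNK])   # first element of each chunk: 0.0 ... 9.0

If the chunks instead have to stay as separate .npy files (an assumption on my part), np.load(path, mmap_mode='r') at least avoids materializing each chunk in RAM before the slice assignment.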

Improving Performance

We run a Magento 1.9 online store with 110,000 products. We are running into issues with long page load times on category pages and product pages.

Dedicated WebApp server specs:
– Apache 2.4
– PHP 5.6
– PHP.ini heavily tweaked, along with an 8GB Redis cache
– 6 CPU Cores
– 16GB Memory
– SSD Storage

Dedicated Database Server:
– MySQL 5.6 (tuned with MySQL Tuner)
– 6 CPU Cores
– 16GB Memory
– SSD Storage

The site hasn’t launched yet as we are trying to fix these performance issues.

It seems to me that the bottleneck is the database. I’ve read nearly every article I could find on improving performance, but we are still seeing 6-11 second load times.

[mysqld]
innodb_buffer_pool_instances = 11
innodb_log_file_size = 1024M
innodb_buffer_pool_size = 12000M
query_cache_size = 64M
query_cache_type = 0
query_cache_limit = 2M
key_buffer_size = 36M
max_heap_table_size = 16M
tmp_table_size = 256M
join_buffer_size = 8M
max_allowed_packet = 16M
innodb_lock_wait_timeout = 300
innodb_read_io_threads = 8
innodb_write_io_threads = 8
innodb_thread_concurrency = 12
max_connections = 500
thread_cache_size = 32
thread_concurrency = 12
innodb_flush_log_at_trx_commit = 2
datadir=/var/lib/mysql
socket=/var/lib/mysql/mysql.sock
symbolic-links=0
sql_mode=NO_ENGINE_SUBSTITUTION,STRICT_TRANS_TABLES

[mysqld_safe]
log-error=/var/log/mysqld.log
pid-file=/var/run/mysqld/mysqld.pid

Actual server resource usage is low. I’d love any recommendations on how we can improve the load speed and make better use of the available CPU and memory.

Improving performance for data row loop

I have a DataTable filled with data from an SQL view. I then need to populate specific rows with values queried from Oracle PL/SQL; we do not use DBLINK.

I have come up with this approach:

foreach (DataRow row in tb.Rows)
{
    foreach (DataColumn column in tb.Columns)
    {
        var rowIndex = tb.Columns.IndexOf(column.ColumnName);
        switch (column.ColumnName)
        {
            case "someColumn":
                column.ReadOnly = false;
                column.MaxLength = 500;
                oneIndex = tb.Columns.IndexOf("columname");
                twoIndex = tb.Columns.IndexOf("columname");
                threeIndex = tb.Columns.IndexOf("columname");
                row[rowIndex] = policyData
                    .AccumulatedIntakeSumForEmployer(row[oneIndex].ToString(),
                        row[twoIndex].ToString(),
                        row[threeIndex].ToString())
                    .ToString(CultureInfo.InvariantCulture);
                break;

The problem is that when the row count is high (2,000 rows or more), it takes a long time to populate the rows (15 rows to be modified, 45 columns to loop through).

Is there any way to improve this approach and process the DataTable faster, without using Parallel?

Website not improving for 3 years

I feel so discouraged: I’ve had a website for 3 years and I still haven’t reaped any noticeable benefit from it.
I used to write content once every week, but only a couple of posts ranked. For the past few months, I stopped
putting up new content because of school. Now that I’ve graduated I have the time, but I don’t know where
to start. The few posts that had ranked are no longer on the first page. I’m barely getting any views or clicks.
https://imgur.com/a/DCfPrAs
In the analytics and performance images, you can see how much my website has deteriorated. It’s like I’m
back to level 1. I made this website in the hope that I would have a personal income and become independent.
I’m only 17 and I have so much to learn. I’m still hosting this website thanks to my father’s support, but I
fear that he may tell me to take down the site as it’s useless and wasting money.
Can someone please give me some advice? I chose this niche because it’s something I’m interested in.
I didn’t know it would be this tough.
Here’s the site: guition.com

PHP – Improving a rule engine

I am trying to create a simple parsing rule engine that can parse text according to parsing rules the user can define.

The problem I’ve encountered is that my users can currently save text into my database in two ways:

documents:

id | path         | content
1  | /mydoc.txt   | {"text":"My document text.\nIs awesome!\n\f"}
2  | /another.txt | {"column":[{"1":[{"1":"A line in column 1.\n"}],"2":[{"1":"Another line.\n"},{"2":"Yet another in column 2\n"}]}]}

So my users can parse either a plain text string (text) or column/table rows (column).

I have created a class that can apply parsing rules:

ApplyParsingRules.php:

public function parseContent(array $content, Field $field) {
    if ($field->rules->count() > 0) {
        $result['text'] = $this->parse($content, $field->rules);
        $result = json_encode($result);
    }

    return $this->data->fields()->attach($field, ['content' => $result ?? null]);
}

/**
 * Iterate through each rule and parse through the content.
 *
 * @return string
 */
public function parse(array $content, object $rules) : string {
    foreach ($rules as $rule) {
        $class = $this->getClass($rule);
        $content = $class->apply($content);
    }

    return $content;
}

public function getClass(FieldRule $FieldRule) {
    $arguments = unserialize($FieldRule->arguments);
    $class = 'App\StreamParser\Parser\ParsingRules\Text\\' . Str::camel($FieldRule->method);

    return new $class($arguments);
}

And it is called like this:

$Parser = new ApplyParsingRules();
$result = $Parser->parseContent($content, $field);

An example rule could be textReplace.php:

public function __construct(array $arguments) {
    $this->search = $arguments['search'];
    $this->replace = $arguments['replace'];
}

public function apply(array $content): string {
    return str_replace($this->search, $this->replace, $content['text']);
}

The above setup works fine. I can provide the $content['text'] from the database, which is basically:

My document text.\nIs awesome!\n\f 

However, I would like to allow this to also parse the column data (for example, only perform text replacement in column 2, or capitalize everything in column 1, row 1).

Any tips / ideas on how I can improve my rule class to accept both $content['text'] and $content['column'][$i]?

Improving myself and getting rid of my bad player habits

Title really says it all. For a period of time (one or two years? Hard to tell) I became oddly ill: incredibly sleepy, irritable, forgetful, and it was all I could do to participate in the game. The GM thought I was becoming apathetic to his world, the other players just assumed my character was being distant, and I struggled to roleplay and deal with some mechanical problems with the character and the system (lots of homebrew) as a whole. With medication, though, I slowly started coming around. This came at a cost, however.

My irritability became explosive. I said things that I regret to this day, and ruined some friendships as a result. It’s been… almost 9 months since the last straw broke and the long-running game had to be put on hiatus. We had thought to try to talk things out and see if we couldn’t cement some positive habits in place of the negative ones through a series of modules, each run by a different player and then by a GM. Unfortunately, my schedule went haywire and this couldn’t happen. I decided to back out of the group until I could fix myself, and so they could continue playing in other games.

Lately, I’ve noticed myself falling into bad habits. Lots of negativity toward certain mechanics or just some darker thoughts. Nothing to talk with a psychiatrist about, but more things that I’m trying to fix about myself and get rid of.

I’m wondering if anyone knows of a good way to help change the way a player plays without them being in a game. My schedule doesn’t allow me to play locally, and I don’t want to subject a GM to me if I relapse.

EDIT: Sorry if the tags weren’t correct. This is a difficult question for me to figure out.

What are the sources and scope for improving website ranking and traffic?

I have plenty of websites of my own, but they have all become stagnant.

Can you help me improve its ranking and generate more traffic to the website? The website is insurance related.

Website: www.monthtomonthcarsinsurance.com

Targeted region: USA

Website about: Car/Auto Insurance.

I am targeting keywords that have very low competition and high search volume.

Currently I am ranking number one for some of the high-search-volume keywords for MonthtomonthCarsInsurance.com…

Do 10 Pbn Posts Dofollow Backlinks To Website Improving Rank for $30

We provide high-authority, manual, high TF/CF/DA/PA homepage PBN backlinks on the WordPress platform, to high-quality standards. With the recent changes in the Google algorithm, you need QUALITY BACKLINKS. And by quality backlinks, we mean contextual backlinks from high DA (Domain Authority) and PA (Page Authority) expired websites (which are called gems in the SEO industry).

Key features of our service:
– WordPress hosted domains
– Home page backlinks on DA/PA sites
– TF 10 to 40+
– DA 10 to 40+
– PA 15 to 40+
– Avg TF/CF/DA/PA would be 25+
– 100% manual post
– All are contextual dofollow links
– Links from aged, powerful domains
– 24/7 customer support

We have a large blog network in which all blogs have page authority 27 plus. This service IS EXCLUSIVELY FOR QUALITY LOVERS who want natural links with relevant content on HIGH AUTHORITY sites. Such high-metrics links will definitely boost your SERP.

Improving speed of Binomial and Multinomial Random Draws

A number of users have discussed the speed of Random number generation in Mathematica.

The Binomial and Multinomial random number generators in Mathematica are fast if multiple draws are needed for the same distribution parameters. For example, generating 500,000 draws from the Binomial distribution is very quick:

In[30]:= AbsoluteTiming[
  RandomVariate[BinomialDistribution[100, 0.6], 500000];]

Out[30]= {0.017365, Null}

However, the speed is slow compared to that in R and Julia when the parameters change across draws, as may be required when performing certain Monte Carlo simulations.

For example, suppose we have a vector nvec that contains the number of trials for each draw and a vector pvec that contains the corresponding probabilities of success:

nvec = RandomInteger[{5, 125}, 500000];
pvec = RandomReal[{0, 1}, 500000];

Then we have

In[28]:= AbsoluteTiming[
  Mean[Table[
     RandomVariate[BinomialDistribution[nvec[[i]], pvec[[i]]]], {i, 1,
      Length@nvec}]] // N
  ]

Out[28]= {36.2144, 32.5283}

This performance hit most probably stems from how these generators are implemented internally in Mathematica.

Are there alternative methods that are fast when the distribution parameters change across draws?
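For comparison only (this is NumPy rather than Mathematica, and just a sketch of the vectorized behaviour I would like to reproduce): NumPy’s binomial sampler broadcasts over array-valued parameters, so a single call produces one draw per (n, p) pair.

# Comparison sketch in NumPy, not Mathematica: binomial() broadcasts over
# array-valued n and p, giving one draw per (n, p) pair in a single call.
import numpy as np

rng = np.random.default_rng(0)

# Mirror nvec and pvec from the question.
nvec = rng.integers(5, 126, size=500_000)    # trial counts in {5, ..., 125}
pvec = rng.uniform(0.0, 1.0, size=500_000)   # success probabilities

draws = rng.binomial(nvec, pvec)             # 500,000 draws, one per pair
print(draws.mean())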