Too many database connections – MySQL upper ceiling

We run a high-traffic blog with around 500 real-time users and 10,000 posts. We self-host on a VPS (16 GB, 4 cores) with a separate MySQL database server (4 GB, 2 cores); both are on SSDs. The hosting server has no problems, but the database server at times runs out of connections, even though our max_connections ceiling of 312 should be more than sufficient. We don’t have any stray plugins, and our maximum execution time is around 5 minutes. Does anyone have an idea what might be causing the issue? Database connections peak at the 312 maximum during working hours.

More insights: what we could correlate is that the issue is more frequent on days when our bloggers (at most 10) work from our office, behind a single IP and on a slow, frequently interrupted network connection. The issue is also more frequent at peak publishing hours than during idle hours.

Or is the database server running out of RAM, which eventually causes the connection blackout? Search-engine crawls usually consume more CPU, but the bots don’t establish more than 30 DB connections, and our servers are totally fine handling them.

The issue is also prevalent when a single admin keeps posts in edit mode for long periods of time.

wp-config details:

define('WP_MEMORY_LIMIT', '256M'); define('AUTOSAVE_INTERVAL', 160); max_execution_time = 500 (php.ini)

Our Query Monitor shows some PHP 7.2 deprecation warnings (though the theme is stable) along with some slow queries. Also, post revisions are vital for us.
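For reference, whether the ceiling is actually being hit, and by whom, can be checked directly on the database server; a minimal diagnostic sketch using standard MySQL statements (run from any client with access to the DB server):

```sql
-- Configured ceiling (should show 312 in our case)
SHOW VARIABLES LIKE 'max_connections';

-- High-water mark of simultaneous connections since the last restart
SHOW GLOBAL STATUS LIKE 'Max_used_connections';

-- What current connections are doing; many long-lived "Sleep" rows
-- point to clients holding connections open rather than query load
SHOW FULL PROCESSLIST;
```

If Max_used_connections sits at 312 while most PROCESSLIST rows are idle, the pressure is from connections being held open (e.g. by slow clients), not from query volume.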

[Law & Ethics] Open Question: If drugs are illegal in the USA, how do so many celebrities use them?

Celebrities like Miley Cyrus have admitted to using marijuana and cocaine. I’ve looked up the list of drugs that are legal for recreational use in the USA, and marijuana and cocaine are not among them. I’m confused. If drugs like marijuana and cocaine are prohibited and banned in the USA, how do celebrities access them and boast about it without being arrested?

Oracle impdp of only one schema when the expdp used many schemas

I have the following problem and doubt in Oracle. I am trying to use impdp to import a backup of my database in which I exported several schemas. The problem comes when I try to import only one of the schemas from the export (expdp) into a completely empty schema in the same database. During the import I see several errors: creation of sequences, data that already exists, and creation of other objects in general. How is this possible if the schema I am importing into is empty? I run the expdp as follows:

To export: 

expdp \"/ as sysdba\" DIRECTORY=logic_bakups DUMPFILE=backup%U.dmp FLASHBACK_TIME=timestamp PARALLEL=5 SCHEMAS=SCH1,SCH2,SCH3,SCH4

To import:

impdp \"/ as sysdba\" DIRECTORY=imp_logic_backups DUMPFILE=backup%U.dmp PARALLEL=5 REMAP_SCHEMA=SCH3:SCH3_V2 REMAP_TABLESPACE=source_tablespace:target_tablespace_V2
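As a side note: run without a SCHEMAS parameter, impdp will attempt to restore every schema contained in the dump (REMAP_SCHEMA only renames SCH3; it does not filter out the others). A sketch of limiting the import to the one schema, keeping the other parameters as above — treat the diagnosis as an assumption:

```shell
impdp \"/ as sysdba\" DIRECTORY=imp_logic_backups DUMPFILE=backup%U.dmp \
  PARALLEL=5 SCHEMAS=SCH3 \
  REMAP_SCHEMA=SCH3:SCH3_V2 \
  REMAP_TABLESPACE=source_tablespace:target_tablespace_V2
```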

Thank you

Many DNS records after creating a subdomain in cPanel

I am new to cPanel, but I think I understand the basics of DNS and DNS zones well enough.

When I add a new subdomain in cPanel, like sub.example.com, via the Subdomains screen, the following records automatically appear in the DNS Zone Editor:

www.sub.example.com
webmail.sub.example.com
cpanel.sub.example.com
cpcontacts.sub.example.com

I don’t need all these records; the only one I want is the A record for sub.example.com, and deleting them one by one is a nightmare. There are maybe 30 records, and I want to add several more subdomains.

Is there a way to prevent these additional entries from being created or at least to delete multiple entries at the same time?
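For reference, if SSH access to the account is available, records can be listed and removed from the shell with cPanel’s API 2 ZoneEdit module instead of clicking through the Zone Editor; a sketch (the username and line number are illustrative):

```shell
# List all records in the zone, with their zone-file line numbers
cpapi2 --user=myuser ZoneEdit fetch_zone_records domain=example.com

# Remove the record stored at zone-file line 27
cpapi2 --user=myuser ZoneEdit remove_zone_record domain=example.com line=27
```

Line numbers shift after each removal, so when deleting several records it is safest to work from the highest line number down.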

Bounded function with finitely many discontinuities is integrable $\overset{?}{\Rightarrow}$ density of continuous distribution function is not unique


The density function of the distribution function of a continuous random variable is not uniquely defined.
A new density function can be obtained by changing the value of the function at a finite number of points to some non-negative values, without changing the integral of the function. We then get a new density function for the same continuous distribution.

Does this follow from the theorem:

A bounded function with a finite number of discontinuities over an interval is Riemann integrable.

or is there a different theorem supporting the above claim? Is the theorem a sufficient justification?
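To spell the claim out: suppose $f$ is a density of the continuous distribution function $F$, and $g$ agrees with $f$ everywhere except at finitely many points $x_1,\dots,x_n$, where $g(x_i)=c_i\ge 0$. Since altering a bounded integrand at finitely many points does not change the value of the integral, for all $x$

$$F(x)=\int_{-\infty}^{x} f(t)\,dt=\int_{-\infty}^{x} g(t)\,dt,$$

so $g$ is a density of the same $F$. The quoted theorem guarantees that $g$ is still Riemann integrable on bounded intervals; the equality of the two integrals is the separate (elementary) fact that a finite set of points contributes nothing to the limit of the Riemann sums.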

What is the practical limit to how many object classes you can detect with Faster RCNN?

I am trying to follow this tutorial, where the Faster-RCNN-Inception-V2-COCO model from TensorFlow’s model zoo is used to detect playing cards. I was wondering what the practical limit is to the number of object classes I could use this to detect. Specifically, I want to detect every letter in the English language (distinguishing capital from lowercase), every number, and mathematical symbols. Would this model work with that many different classes?

Also, if I want to detect some words, would it make sense to label all the characters that make up a word and also label the word itself (assuming there are only a few important words I actually want to detect)?
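For scale: 52 letters, 10 digits, and a few dozen mathematical symbols come to roughly 70–100 classes, which is in the same range as the ~90-entry COCO label map this checkpoint was trained on. In the TensorFlow Object Detection API each class is simply one entry in label_map.pbtxt; a sketch (names illustrative):

```text
item { id: 1  name: 'A' }
item { id: 2  name: 'a' }
item { id: 3  name: 'B' }
# ... one item per character or symbol, e.g.
item { id: 72 name: 'plus_sign' }
```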

Am I stuck resolving many ThreadSafeReferences to use Realm with asynchronous tasks like network requests?

I have a set of NSOperations that make network requests. Imagine a case where we have a User object in Realm and I want to make some changes and then send a PATCH request to the server:

let operation = UpdateUserOperation(updatedUser) 

The operation runs on a different thread, so it has to resolve a thread-safe reference:

class UpdateUserOperation : Operation {
    var userReference : ThreadSafeReference<User>

    init(_ user: User) {
        userReference = ThreadSafeReference(to: user)
    }

    override func main() {
        // We're probably on a different thread, so resolve the reference
        let user = try! Realm().resolve(userReference)!

        // That invalidated `userReference`, so we need to create a new
        // one to use below...
        userReference = ThreadSafeReference(to: user)

        sendUserPatchRequest(user) { response in
            // We might be on _another_ thread again :(
            let realm = try! Realm()
            let user = realm.resolve(self.userReference)!
            try! realm.write {
                user.updateFromResponse(response)
            }
        }
    }
}

This feels like a really unclean way to do it – re-fetching the user so many times for a pretty simple task. It feels especially onerous because we need to re-create the thread-safe reference each time – they aren’t reusable. In Core Data, we could choose a single NSManagedObjectContext to do our work in and ensure thread safety by using managedObjectContext.perform { /* ... */ }, but that sort of functionality is unavailable in Realm.

Am I missing anything? Is there a better way to do this, or am I stuck re-fetching the object each time I need to use it?
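For comparison, one pattern that avoids juggling ThreadSafeReferences is to hand the operation a primary key and re-fetch with object(ofType:forPrimaryKey:) on whichever thread needs the object. A sketch, assuming User declares a string primary key id (hypothetical here) and the same sendUserPatchRequest helper:

```swift
class UpdateUserOperation: Operation {
    let userId: String

    init(_ user: User) {
        userId = user.id   // assumes `id` is User's primary key
    }

    override func main() {
        let realm = try! Realm()
        guard let user = realm.object(ofType: User.self, forPrimaryKey: userId) else { return }

        sendUserPatchRequest(user) { response in
            // Possibly yet another thread: just re-fetch by key
            let realm = try! Realm()
            guard let user = realm.object(ofType: User.self, forPrimaryKey: self.userId) else { return }
            try! realm.write {
                user.updateFromResponse(response)
            }
        }
    }
}
```

Primary keys are plain values, so they can be reused across threads indefinitely, unlike a ThreadSafeReference, which is consumed when resolved.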