What is the fastest algorithm to establish whether a linear system in $\mathbb{R}$ has a solution?

I know the best algorithm to solve a linear system in $\mathbb{R}$ with $n$ variables is the Coppersmith–Winograd algorithm, which has a complexity of $$O\left(n^{2.376}\right).$$ How much easier is it to simply determine whether the same system has any solution?

More precisely, given a system of $ m$ equations and $ n$ unknowns, what is the complexity of establishing whether it has any solution?
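For concreteness: consistency can be checked via the Rouché–Capelli criterion, i.e. $Ax = b$ has a solution iff $\operatorname{rank}(A) = \operatorname{rank}([A \mid b])$, and computing a rank is asymptotically no harder than matrix multiplication. Here is a minimal NumPy sketch of that check (the name `is_consistent` is just illustrative, and `matrix_rank` goes through an SVD, so this is a plain cubic-time check rather than an asymptotically fast one):

```python
import numpy as np

def is_consistent(A: np.ndarray, b: np.ndarray) -> bool:
    """Rouché–Capelli: Ax = b is solvable iff rank(A) == rank([A | b])."""
    augmented = np.column_stack([A, b])
    return np.linalg.matrix_rank(A) == np.linalg.matrix_rank(augmented)

# Example: an inconsistent 2x2 system (two parallel lines).
A = np.array([[1.0, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, 2.0])
print(is_consistent(A, b))  # False
```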

Given a set of file URLs from SharePoint, what is the fastest way to download them all?

I have a list of SharePoint list item URLs and list item attachment URLs. I need to download all the files from each one.

There are 50,000,000 files across 200 SharePoint farms. They are on a variety of SharePoint environments: 2010, 2013, 2016 and soon 2019. I also have a couple of SharePoint Online instances I need to download from.

We have limited the download to documents of 80 MB or less, and we have eliminated all documents that we don’t want to download.

So now I need the fastest possible way to download 50 million files from SharePoint.

I am currently using a multi-threaded HTTP client program to download them. Each file is downloaded with a single HTTP request.

This is taking a very long time because it seems to tax the SharePoint web servers’ CPUs pretty heavily, especially if I turn the number of download threads up too high.
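For reference, here is a minimal sketch of the kind of throttled, multi-threaded downloader described above, assuming a plain list of direct file URLs and leaving authentication (NTLM/OAuth, depending on the farm) to be filled in; the worker count and the backoff on throttling responses are the knobs that keep the web front ends from being hammered. All names and numbers are illustrative:

```python
import concurrent.futures
import os
import time
from urllib.parse import unquote, urlsplit

import requests

MAX_WORKERS = 8       # keep low enough that the SharePoint WFEs stay responsive
OUTPUT_DIR = "downloads"

def download(url: str) -> str:
    """Stream one file to disk, backing off if the server throttles or errors."""
    # Note: colliding basenames would need a smarter target path in practice.
    name = unquote(os.path.basename(urlsplit(url).path)) or "unnamed"
    target = os.path.join(OUTPUT_DIR, name)
    for attempt in range(3):
        resp = requests.get(url, stream=True, timeout=60)  # pass auth=... / headers=... as needed
        if resp.status_code == 429 or resp.status_code >= 500:
            time.sleep(2 ** attempt)  # simple exponential backoff
            continue
        resp.raise_for_status()
        with open(target, "wb") as fh:
            for chunk in resp.iter_content(chunk_size=1 << 20):
                fh.write(chunk)
        return target
    raise RuntimeError(f"giving up on {url}")

def download_all(urls):
    os.makedirs(OUTPUT_DIR, exist_ok=True)
    with concurrent.futures.ThreadPoolExecutor(max_workers=MAX_WORKERS) as pool:
        for saved in pool.map(download, urls):
            print("saved", saved)
```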

I read this article: https://www.itprotoday.com/5-reasons-why-you-have-sharepoint-performance-issues and it seems to say that BLOB-to-document conversion is the reason, but I’m not positive.

Is there some faster way to download all these files using a batch operation? For example, some way to get several files at once from a single web request instead of one at a time?

I was doing some reading here: https://docs.microsoft.com/en-us/sharepoint/dev/general-development/how-to-crawl-binary-large-objects-blobs-in-sharepoint

Is there any way we can use a direct connection to the database to bulk download content?

Or is there a way to use the export feature to somehow accomplish this?

Fastest way to upload to Google Photos (unlimited storage)

I have about 190 GB of photos, mostly high-quality JPEGs (roughly 5 MB per file).

They reside on a hard drive in a Synology NAS. I can use an app called Cloud Sync to upload the photos to Google Drive. I turned on ‘show Google Photos as a folder in Drive’, but unfortunately all the files count against the quota.

I don’t care about the quality loss, so I use the ‘High quality’ option in Google Photos, which has unlimited storage.

What would be the best way to upload all my photos?

I would guess I need some binary/batch command that converts all my JPGs to a lower JPEG quality, skips the JPEGs that are already low quality, and then uploads them.

Most convenient would be if I could do all of this from my NAS.
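A minimal sketch of the recompression step, assuming Python with Pillow is available (e.g. on the Mac, or in a Docker container on the Synology); the source/destination paths and the 2 MB “already low quality” cutoff are just illustrative guesses:

```python
from pathlib import Path

from PIL import Image  # Pillow

SOURCE = Path("/volume1/photo")       # illustrative NAS share path
DEST = Path("/volume1/photo_small")   # recompressed copies land here
SKIP_BELOW = 2 * 1024 * 1024          # treat files under ~2 MB as already low quality
QUALITY = 85                          # JPEG quality for the recompressed copies

for src in SOURCE.rglob("*.jpg"):     # add "*.JPG" / "*.jpeg" patterns as needed
    out = DEST / src.relative_to(SOURCE)
    out.parent.mkdir(parents=True, exist_ok=True)
    if src.stat().st_size < SKIP_BELOW:
        out.write_bytes(src.read_bytes())   # small enough already: copy as-is
        continue
    with Image.open(src) as im:
        exif = im.info.get("exif", b"")     # keep date/camera metadata if present
        im.save(out, "JPEG", quality=QUALITY, optimize=True, exif=exif)
```

The resulting folder could then be pointed at by Cloud Sync (or uploaded from the Mac), and the originals stay untouched.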

Perhaps even more convenient would be to upload the originals and somehow get Google to do the conversion while still making use of the unlimited storage 🙂

I’m on a Mac running macOS High Sierra.

Fastest way to migrate to a bigger SSD (18.04)

I’ve seen a few posts about migration with older Ubuntu versions.

What’s the fastest way to migrate a 120 GB drive (80 GB full) to a 1 TB SSD under Ubuntu 18.04?

  • Is a live duplicate possible, or do I have to go via an image?
  • How do I extend the new volume on the new SSD?

Because I only have a limited time window once I’ve started, I’d be happy for suggestions on the fastest way to get the new system up and running again.

Thanks for your help!

Fastest Possible SharePoint Online Modern Site

There is lots of documentation on best practices for SharePoint Online and how to optimize performance.

I’m looking for an exact description of how to create a brand new Modern Site that will perform as well as possible as a shared document repository and for basic lists. We don’t really need anything else.

What configurations should I really be making? CDN? Publishing + Output Caching? Team Site vs Communication Site? Other stuff?

I’m looking for a single, clear configuration state, not a combination of optimization guides.

Thanks.