Offering a compressed file for download that’s around 1.2 GB – should we split it into smaller parts?

The file contains thousands of PDFs and will be posted on a government website. The primary audience is researchers and the media, on desktop rather than mobile. We plan to indicate the file size. Given that 1.2 GB is fairly large, are there any reasons we should split the file into smaller parts? I’m asking because I’m simply not sure either way.
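
For scale, the splitting itself is trivial; the real cost is the reassembly step it pushes onto downloaders. A minimal sketch in Python, with a hypothetical file name and part size:

    # Hypothetical sketch: split archive.zip into fixed-size parts.
    # The file name and 200 MB part size are assumptions, not from the post.
    PART_SIZE = 200 * 1024 * 1024

    with open('archive.zip', 'rb') as src:
        index = 0
        while True:
            chunk = src.read(PART_SIZE)
            if not chunk:
                break
            with open('archive.zip.%03d' % index, 'wb') as dst:
                dst.write(chunk)
            index += 1

Recipients would then have to reassemble the parts (e.g. cat archive.zip.* > archive.zip, or copy /b on Windows), which is exactly the extra step we would be imposing on the audience.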

Can data be compressed through this hash function technique?

I’d like to know if this data compression scheme would work or not, and why:

Suppose we have a file. If we treat the bits that make up the file as the binary representation of a number n, we have n (if the first bit is zero, we flip every bit so that n is unambiguous). We are then left with the number n and a boolean that tells us whether the bits of n’s binary representation need to be flipped back or not.
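
A minimal sketch of this step (the file handling is my assumption); note that the bit length must also be carried along, or leading zeros are lost when converting back:

    # Sketch of the described normalization: read the file as a big integer n,
    # flipping all bits when the leading bit is 0. The bit length must be
    # stored too, which the scheme as described glosses over.
    def file_to_n(path):
        data = open(path, 'rb').read()
        bits = len(data) * 8
        n = int.from_bytes(data, 'big')
        flipped = False
        if bits and not (n >> (bits - 1)) & 1:   # leading bit is 0
            n ^= (1 << bits) - 1                 # flip every bit
            flipped = True
        return n, flipped, bits

    def n_to_file(path, n, flipped, bits):
        if flipped:
            n ^= (1 << bits) - 1                 # undo the flip
        open(path, 'wb').write(n.to_bytes(bits // 8, 'big'))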

My idea was to approximate n from below (e.g. by finding a relatively big number raised to a relatively big power, such as 17^6038) and then compute arbitrary hashes for all numbers from this approximation up to the real n, counting the number of collisions. When we finally reach n, we have the “collision state” of the hashes, and we output the compressed file, which basically contains information about how to reconstruct the approximation of n (e.g. 17^6038) and the “collision state” at n (note that this “collision state” must also occupy very few bits, so I’m not sure this is possible).

The decompression procedure would do a very similar thing: it would approximate n (e.g. compute ~n as 17^6038) and then hash (i.e. apply a function and check the result) every single number (we could also check every 5 numbers, or some other divisor of n − ~n) until the “collision state” is the same as the one specified in the compressed file. Once everything matches, we have n. Then it would just be a matter of flipping every bit or not (as specified in the compressed file) and writing the result to a file.
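
To make the idea concrete, here is a toy sketch of that walk, under the assumption that the “collision state” is just a running count of repeated truncated-hash values. The catch is visible in the recovery loop: the first index whose replayed count matches the stored count need not be n, and by a counting argument a state of very few bits cannot distinguish all the possible distances n − ~n:

    import hashlib

    def tiny_hash(i):
        # Truncated hash: first byte of SHA-256 of the decimal string (an
        # arbitrary stand-in for the question's "arbitrary hashes").
        return hashlib.sha256(str(i).encode()).digest()[0]

    def collision_state(approx, n):
        # Compression side: count repeated hash values from approx up to n.
        seen, collisions = set(), 0
        for i in range(approx, n + 1):
            h = tiny_hash(i)
            if h in seen:
                collisions += 1
            seen.add(h)
        return collisions

    def recover(approx, state):
        # Decompression side: replay the walk until the count matches.
        # Returns the FIRST match, which in general is not the original n.
        seen, collisions, i = set(), 0, approx
        while True:
            h = tiny_hash(i)
            if h in seen:
                collisions += 1
            seen.add(h)
            if collisions == state:
                return i
            i += 1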

Could this work? The only problem I can think of (besides the processing time required) is that the number of collisions could be extremely large.

What is the difference between Memory, Real Mem, and Compressed Mem?

I’ve already seen this question:

  • What's the difference between Real, Virtual, Shared, and Private Memory?

but I think it might be outdated. Specifically, there are now a Memory column as well as Real Mem and Compressed Mem columns. What is the difference, and why would Real Mem ever be smaller than Compressed Mem?

(I’m using macOS Sierra 10.12, but I think I’ve seen this in slightly older versions as well.)

[Activity Monitor screenshot]

Skimage cannot find _tifffile module – loading of some compressed images will be very slow

I am trying to run Faster R-CNN on an NVIDIA Xavier. I have followed this guide here and the process went fine. However, whenever I attempt to run the demo I get this error:

    /usr/local/lib/python2.7/dist-packages/skimage/external/tifffile/tifffile.py:299: UserWarning: ImportError: No module named '_tifffile'. Loading of some compressed images will be very slow. Tifffile.c can be obtained at http://www.lfd.uci.edu/~gohlke/

I have run pip install -U scikit-image and pip install -U tifffile to make sure both are up to date. In that path there is a _tifffile.so, but it is not being imported. In the directory above it I attempted to run python setup.py install, but it fails, saying it is missing tifffile.c. When I follow the link from the warning I cannot find Tifffile.c.
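
One diagnostic that might narrow this down (my suggestion, not from the guide): import the compiled extension directly, so the real error surfaces instead of being swallowed by the warning:

    # Try importing the compiled extension directly to expose the underlying
    # ImportError (e.g. a .so built for the wrong architecture on the
    # Xavier's ARM CPU would fail here with a loader error).
    import traceback
    try:
        from skimage.external.tifffile import _tifffile
        print('_tifffile imported fine')
    except ImportError:
        traceback.print_exc()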

In tifffile.py, this snippet is causing the issue:

    try:
        if __package__:
            from . import _tifffile
        else:
            import _tifffile
    except ImportError:
        warnings.warn(
            "ImportError: No module named '_tifffile'. "
            "Loading of some compressed images will be very slow. "
            "Tifffile.c can be obtained at http://www.lfd.uci.edu/~gohlke/")

Any help is appreciated!

Create archived version of a directory tree where only individual files are compressed

I would like to archive a project that includes many directories and subdirectories. I want a new, separate copy in which the directory tree is preserved and only the individual files inside it are compressed.

Example:

    Original structure:

    \Directory1\
    └──TestFile1.txt
    └──TestFile2.txt
    └──Sub1-1\
       └──TestFile3.txt
    \Directory2\
    └──\Sub2-1\
        └──TestFile5.txt
        └──TestFile6.txt

    Archive (in new location, zip or gz):

    \Directory1\
    └──TestFile1.txt.zip
    └──TestFile2.txt.zip
    └──Sub1-1\
       └──TestFile3.txt.zip
    \Directory2\
    └──\Sub2-1\
        └──TestFile5.txt.zip
        └──TestFile6.txt.zip

I’ve got a solution from a very helpful community in #bash on the Freenode network:

    find Documents/ -type f -exec sh -c 'for f do mkdir -p "/media/me/zipped/${f%/*}"; gzip < "$f" > "/media/me/zipped/$f.gz"; done' _ {} +

Reasons for this request:

  • Make file corruption less likely
  • Easier and faster to open smaller files than one huge archive or multiple split archives
  • Preserve the directory structure

I would like to:

  • See other suggestions and get comments from the community here
  • Get a similar solution for Windows servers, using Cygwin or a GUI tool (a portable sketch follows below)
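
As a starting point for the Windows side (a sketch, not from the thread): the same per-file gzip idea is a few lines of portable Python, which avoids Cygwin entirely. The source and destination paths below are hypothetical:

    # Walk the tree and gzip each file individually, mirroring the directory
    # structure under a new root. Works unchanged on Windows and Linux.
    import gzip, os, shutil

    SRC = 'Documents'        # hypothetical source tree
    DST = r'D:\zipped'       # hypothetical destination root

    for root, _dirs, files in os.walk(SRC):
        for name in files:
            src_path = os.path.join(root, name)
            rel = os.path.relpath(src_path, SRC)
            dst_path = os.path.join(DST, rel + '.gz')
            os.makedirs(os.path.dirname(dst_path), exist_ok=True)
            # compress each file on its own, preserving the tree
            with open(src_path, 'rb') as fin, gzip.open(dst_path, 'wb') as fout:
                shutil.copyfileobj(fin, fout)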

Is compressed air effective for cleaning a 2018 MacBook Pro keyboard that registers repeated keypresses?

My 2018 15″ Retina MacBook Pro is starting to display the dreaded keyboard defects. Specifically, the “A” key has started to register double keypresses. Initially this was very infrequent, but it appears to be getting worse: I would estimate that, in the last few minutes, a repeat keypress has happened about 5-10% of the time.

Although I could take the computer to an authorized repair shop as it’s still under warranty (and I bought AppleCare for it), I know the turnaround time is fairly long, and this is the only computer I can realistically use without immensely disrupting my workflow. Also, there is no Apple Store within 500 km of my location. Therefore, I set out to look for DIY fixes.

I found this support document on Apple’s site recommending the use of compressed air as a possible fix to the issue. I have an oil-free air compressor that I could use to apply compressed air to the keyboard.

However, I’m somewhat afraid of doing this. Although this is an Apple-recommended procedure, I can’t help but think it is an aggressive procedure applied to a very delicate mechanism. In particular, I’m afraid my compressed air source may have a small rate of dust contamination, which could actually worsen the problem. Has anyone actually used this procedure on a 2018 MacBook Pro keyboard, with the new silicone membrane? Is this recommendation actually effective? I intend to postpone doing this until I see a success report.

Find one private key from a large list of compressed secp256k1 public keys

I am willing to cooperate with you: help me find one private key from a large list of compressed secp256k1 public keys. The list contains 15,000,000 compressed public keys. To search for the private keys you can apply Pollard’s rho method or the baby-step giant-step method, or you can apply your own method.

https://drive.google.com/drive/folders/1HByDJR9Ck5CdIwTl-v_IzcaVhsG8aKaA
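
For reference (not part of the original request), this is what baby-step giant-step looks like on a toy multiplicative group; secp256k1’s group order is near 2^256, so the method’s √n time and memory are hopeless at that scale:

    # Toy baby-step giant-step for g^x = h (mod p); the small prime group
    # here is an assumption for illustration only.
    import math

    def bsgs(g, h, p):
        m = math.isqrt(p) + 1
        baby = {pow(g, j, p): j for j in range(m)}   # baby steps: g^j
        inv_g = pow(g, p - 2, p)                     # g^-1 via Fermat (p prime)
        factor = pow(inv_g, m, p)                    # g^(-m)
        gamma = h
        for i in range(m):                           # giant steps: h * g^(-im)
            if gamma in baby:
                return i * m + baby[gamma]
            gamma = gamma * factor % p
        return None

    print(bsgs(2, 75, 101))   # hypothetical toy instance: prints 17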

Contacts:

EMAIL: pokgoip@gmail.com

VK: https://vk.com/mistercooper

FB: https://www.facebook.com/dmitry.bazhenovsky