How are machine learning libraries created?

I would like to know how machine learning libraries (or large-scale libraries in general) are created. What I mean is: Python doesn’t have a built-in array type, but C does. So how is that functionality made available to Python, and how does such a project get started and grow into the finished product we know today (like NumPy)?
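My rough mental model so far is that such libraries implement the performance-critical parts in C and expose them to Python. Below is a minimal sketch of that bridging idea using ctypes and the C math library; the library name libm.so.6 is an assumption that holds for glibc on Linux, and real projects like NumPy use compiled C extension modules rather than ctypes, but the principle seems to be the same. Is this roughly how it works?

# Minimal sketch: Python calling a compiled C routine via ctypes.
# Assumes glibc on Linux ("libm.so.6"); the library name differs elsewhere.
import ctypes

libm = ctypes.CDLL("libm.so.6")          # load the shared C library
libm.sqrt.argtypes = [ctypes.c_double]   # declare the C function's signature
libm.sqrt.restype = ctypes.c_double

print(libm.sqrt(2.0))                    # computed by C's sqrt, called from Python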

P.S. Let me know if this is not the right community for asking general computing questions, since there is significant overlap among the CS Stack Exchange sites; if it’s not the right place, please recommend the appropriate Stack Exchange site for general computing questions.

Also, I couldn’t find any more relevant tags, so I had to tag it with machine learning.

Differences in certificate verification between SSL libraries

I’ve been playing with x509 certificates to understand them better, and I’ve hit a strange issue which makes me think I have a misunderstanding. Initially I tested everything with libressl 2.8.3 and things worked as expected; however, when testing against openssl 1.1.1d, things fall apart.

First I’ve created a root key and certificate with

libressl ecparam -out root.pem -name secp384r1 -genkey
libressl req -new -key root.pem -out root.csr
libressl x509 -in root.csr -out root.crt -req -signkey root.pem -days 30

then the intermediate

libressl ecparam -out inter.pem -name secp384r1 -genkey
libressl req -new -key inter.pem -out inter.csr
libressl x509 -in inter.csr -out inter.crt -req -signkey root.pem -days 30

and a leaf

libressl ecparam -out leaf.pem -name secp384r1 -genkey
libressl req -new -key leaf.pem -out leaf.csr
libressl x509 -in leaf.csr -out leaf.crt -req -signkey inter.pem -days 30

The issue I’m hitting is that libressl will verify the intermediate cert while openssl will not:

>>> libressl verify -CAfile root.crt inter.crt
inter.crt: C = US, ST = CA, L = SF, O = Inter
error 18 at 0 depth lookup:self signed certificate
OK
>>> openssl verify -CAfile root.crt inter.crt
C = US, ST = CA, L = SF, O = Inter
error 18 at 0 depth lookup: self signed certificate
error inter.crt: verification failed

Am I missing something, or is openssl exposing a misunderstanding I have about x509 certs and libressl/openssl? Similarly, validating the leaf cert against a bundle of the root and intermediate certs succeeds with libressl and fails with openssl.
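As a cross-check outside both command-line tools, here is a short Python sketch (using the pyca/cryptography package, and assuming the file names generated above) that tests whether the intermediate’s signature actually verifies under the root key. I include it only to show what I’ve tried; it checks just the raw signature, not chain building or X.509 extensions, which is presumably part of what verify does differently.

from pathlib import Path
from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import ec

# Load the certificates produced by the commands above.
root = x509.load_pem_x509_certificate(Path("root.crt").read_bytes())
inter = x509.load_pem_x509_certificate(Path("inter.crt").read_bytes())

# Raises InvalidSignature if the intermediate was not signed by the root key.
root.public_key().verify(
    inter.signature,
    inter.tbs_certificate_bytes,
    ec.ECDSA(inter.signature_hash_algorithm),
)
print("intermediate's signature verifies under the root key")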

How to deal with this fundamental problem with the advice: “Don’t trust obscure PHP libraries that nobody uses!”?

Frequently, I’d say in virtually every case, there is only one PHP library for any particular problem. (I don’t count obsolete, abandoned, or trash ones.)

Therefore, it’s never really a “choice” on my part to use it. I have to either use it or use nothing.

For this simple reason, the sensible-sounding safety advice to “not use obscure libraries that aren’t promoted or used by lots of people and major corporations” is rarely applicable, because there just aren’t any alternatives to pick from!

And this is for PHP, one of the most popular, biggest, and most used programming languages on the planet. Imagine if I were using some far less popular language; I’d never find a library to do anything!

It seems like this advice only works in theory. In reality, there’s very little choice, if any, between libraries and even between languages, unless you are going to do everything on your own, from scratch. (Or possibly if you can pay money, which I cannot, so I’ve never even considered any paid alternatives that might exist.)

The reason I ask is that this is always given to me as one of the main tips for staying secure and avoiding malware from compromised or malicious PHP libraries. But when there’s just one thing to pick, for example “MailMimeParser”, which seems to be the case nearly always (with any “alternatives” having major show-stoppers, such as being dead or simply not working as advertised), what else can I do?

How do we secure image parsing libraries against buffer overflow?

I’m new to buffer overflows via image parsing. How can one design a secure library that parses images and ensure there are no security vulnerabilities in it? It is common knowledge that image parsing libraries are vulnerable to buffer overflows, so I would appreciate it if someone could explain specifically how to secure image parsing libraries against them.
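To make the question concrete, here is a toy Python sketch of the kind of defensive check I have in mind; the 8-byte width/height header format is entirely made up for illustration, and I realize Python itself is memory-safe, but I assume these are the same validations a C parser would need before allocating or copying. Is this the right general idea, and what else is needed?

# Toy parser for a hypothetical format: an 8-byte header holding width and
# height as little-endian uint32s, followed by one byte per pixel.
# The point: never trust length/dimension fields read from the file.
import struct

MAX_DIM = 1 << 15  # reject absurd dimensions before multiplying them

def parse_toy_image(buf: bytes) -> list[bytes]:
    if len(buf) < 8:
        raise ValueError("truncated header")
    width, height = struct.unpack_from("<II", buf, 0)
    if not (0 < width <= MAX_DIM and 0 < height <= MAX_DIM):
        raise ValueError("implausible dimensions")  # blocks integer-overflow tricks in C
    expected = 8 + width * height                   # safe: both operands are bounded
    if len(buf) < expected:
        raise ValueError("pixel data shorter than the header claims")
    # Copy row by row, using only validated offsets.
    return [buf[8 + y * width : 8 + (y + 1) * width] for y in range(height)]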

Maximum number of libraries in one web

I’ve been having a look at this link, as I’m evaluating the options for a document repository with high volume and a rather complex security matrix.

I can see most of the limits I need to start playing with the IA options and pivoting the metadata around, but I can’t see how many libraries I can have in one site.

The boundaries mention how to get up to 250,000 sites. That’s actually more than what I need, but the site level might be an option for us. Still, I was wondering if I could take it one level down; I was thinking of something like 20,000 libraries in a site.

Has anyone done anything like this? Any tips?

Struggling to build Qt due to missing OpenGL libraries

I’m trying to build an executable Qt-based program for Linux, and to link Qt statically into the binary. According to the Qt docs, to do this I’m going to need to build Qt from source, starting with the commands:

cd /path/to/Qt
./configure -static -prefix /path/to/Qt

When I try this, I get a long run of output ending with:

ERROR: The OpenGL functionality tests failed!
You might need to modify the include and library search paths by editing
QMAKE_INCDIR_OPENGL[_ES2], QMAKE_LIBDIR_OPENGL[_ES2] and QMAKE_LIBS_OPENGL[_ES2]
in the mkspec for your platform.

Check config.log for details.

This isn’t the first time I’ve had problems with OpenGL and Qt: when I built my program with dynamic linking (using the binary Qt libs), I got a similar problem, which I solved by symlinking to libGL.so from /usr/lib.

The output from configure said to go and find the “mkspec” for my platform, so I looked at Qt/5.13.1/Src/qtbase/mkspecs/linux-g++-64 (I explicitly told configure to use that platform with the -platform option). I rooted around and tried setting those QMAKE_* variables to various combinations of paths that seemed plausible, to no avail.
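For reference, this is the kind of edit I tried in that mkspec’s qmake.conf; the variable names come straight from the error message, but the paths are just guesses for my system, so they may well be wrong:

# Guessed paths, appended to Qt/5.13.1/Src/qtbase/mkspecs/linux-g++-64/qmake.conf
QMAKE_INCDIR_OPENGL = /usr/include
QMAKE_LIBDIR_OPENGL = /usr/lib/x86_64-linux-gnu
QMAKE_LIBS_OPENGL   = -lGL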

It also mentioned a config.log file. When I look in this file, I can see OpenGL mentioned only in this block:

loaded result for library config.qtbase_gui.libraries.opengl
Trying source 0 (type pkgConfig) of library opengl ...
+ /usr/bin/pkg-config --exists --silence-errors gl
pkg-config did not find package.
  => source produced no result.
Trying source 1 (type makeSpec) of library opengl ...
Include path /usr/include/libdrm is invalid.
header entry 'config.qtbase_gui.libraries.opengl.headers.0' passed condition.
GL/gl.h not found in [/usr/include/libdrm] and global paths.
  => source produced no result.
test config.qtbase_gui.libraries.opengl FAILED
loaded result for library config.qtbase_gui.libraries.opengl_es2
Trying source 0 (type pkgConfig) of library opengl_es2 ...
+ /usr/bin/pkg-config --exists --silence-errors glesv2
pkg-config did not find package.
  => source produced no result.
Trying source 1 (type makeSpec) of library opengl_es2 ...
None of [libGLESv2.so libGLESv2.a] found in [] and global paths.
  => source produced no result.
test config.qtbase_gui.libraries.opengl_es2 FAILED

Indeed, pkg-config --print-errors --exists gl can’t find anything. I have no idea where it’s getting that /usr/include/libdrm path from…

Moving Folders Between Libraries in Different Sites Leaves Copies in Both

When I moved a folder from the Site 1 document library to the Site 2 document library (modern experience in both, by the way), I ended up with copies in both locations. Is this expected behavior, i.e., that it doesn’t actually MOVE the files? Does this also mean that the files in the new location will not retain their Doc IDs and links? Thanks in advance!