Is there a standard for “virtual receipts”, and is it actually used anywhere?

I just got another e-mail from my food store after I had placed an order. It has no plaintext version, only an HTML one. Only with an extreme amount of effort could I parse out the products and their individual prices and quantities… at least until they change their e-mails the next time.

I currently "only" parse out the delivery date/time, the total price for the order, and the order ID. Which is insanity.

Is there really no "digital receipt" standard? They seem to have no hidden JSON/CSV blob anywhere in their e-mail, nor anything manually downloadable from their website when logged in. How is one supposed to build a local database of what they buy and at what prices? Even just figuring out how to parse their e-mails for the total price was quite a bit of work, and I'm certain that almost nobody out there does this.
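
For illustration, this is the kind of brittle extraction I mean: strip the tags and hope a regex finds the total. The markup and wording below are made up; every store's e-mails differ, which is exactly the problem.

```javascript
// Hypothetical example: pull an order total out of an HTML e-mail body.
// Breaks the moment the store rewords "Total" or restructures the markup.
function parseOrderTotal(html) {
  const text = html.replace(/<[^>]+>/g, ' ');        // crude tag stripping
  const match = text.match(/Total:\s*\$?(\d+(?:\.\d{2})?)/i);
  return match ? parseFloat(match[1]) : null;        // null when the layout changed
}
```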

How come this was apparently overlooked, in spite of being such an important and basic thing for "e-commerce"? Am I really expected to manually input all of this data or spend countless hours figuring out their broken HTML blob and keep updating it whenever they change their e-mails, and do this for every single store I ever buy anything from?

I strongly suspect that there is some standard, probably released as an RFC in 1997 or something, but nobody wants to implement it because it means "giving away control" in their eyes?

Is there a way to store an arbitrarily big BigInt in a bit sequence, only later to convert it into a standard BigInt structure?

I am trying to imagine a way of encoding a BigInt into a bit stream, so that it is literally just a sequence of bits. Then, upon decoding this bit stream, you would generate the standard BigInt sort of data structure (an array of small integers plus a sign). How could you encode the BigInt as a sequence of bits, and how would you decode it? I don't see how to properly perform the bitwise manipulations, or how to encode an arbitrary number larger than 32 or 64 bits. If a language is required, I would be doing this in JavaScript.

For instance, this packs four bytes into a single 32-bit integer:

    function arrayOfBytesTo32Int(map) {
        return map[0] << 24
            | map[1] << 16
            | map[2] << 8
            | map[3]
    }

How would you do that same sort of thing for arbitrarily long bit sequences?
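
One possible sketch, using JavaScript's native BigInt: store a sign byte followed by the magnitude as little-endian bytes. The layout is arbitrary (any agreed-upon convention works), and a real serializer would also need a length prefix if several values are concatenated into one stream.

```javascript
// Encode a BigInt as [signByte, ...magnitudeBytes] (little-endian magnitude).
function encodeBigInt(n) {
  const sign = n < 0n ? 1 : 0;
  let mag = n < 0n ? -n : n;
  const bytes = [];
  while (mag > 0n) {
    bytes.push(Number(mag & 0xFFn)); // lowest 8 bits first
    mag >>= 8n;
  }
  return [sign, ...bytes];
}

// Rebuild the BigInt: fold the bytes back in from the high end.
function decodeBigInt(bytes) {
  let mag = 0n;
  for (let i = bytes.length - 1; i >= 1; i--) {
    mag = (mag << 8n) | BigInt(bytes[i]);
  }
  return bytes[0] === 1 ? -mag : mag;
}
```

The same shift-and-mask idea generalizes to engines without BigInt: keep the number as an array of small limbs (say, 16-bit chunks) and handle the carries yourself.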

How to set C++ language standard for VS2019 in an Unreal project?

I am trying to do a simple thing, just like this, in a header file:

    #include <filesystem>
    #include <iostream>

    namespace fs = std::filesystem;

And IntelliSense goes: namespace "std" has no member "filesystem".

Okay, no worries, it's an easy fix. Just set the C++ language standard in the property pages…

Well, it turns out it isn't: that option doesn't exist in an Unreal VS project. I tried typing in the search bar and looking under View -> Property Pages, but no luck.

Okay let’s try doing the whole thing in a console project first.

Same message from IntelliSense as before.

Ok, no worries, I found this.

I found my settings under: Project > projectname Properties

And voilà, the console app works.

Let’s try it in the Unreal project.

Well, well… My options are limited here.

[screenshot: myueoptions]

I had a look around in the project settings as well:

[screenshot: projectsettings]

How do I get this filesystem header to work with my project?
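
For what it's worth, in Unreal projects the compiler and IntelliSense settings come from UnrealBuildTool rather than the VS property pages, which would explain why the option is missing there. In recent engine versions (UE 4.22 and later, if I recall correctly, so treat this as an assumption) the module's Build.cs exposes a CppStandard setting:

```csharp
// MyProject.Build.cs -- "MyProject" is a placeholder module name.
using UnrealBuildTool;

public class MyProject : ModuleRules
{
    public MyProject(ReadOnlyTargetRules Target) : base(Target)
    {
        PCHUsage = PCHUsageMode.UseExplicitOrSharedPCHs;
        PublicDependencyModuleNames.AddRange(new[] { "Core", "CoreUObject", "Engine" });

        // Ask UnrealBuildTool for C++17 so <filesystem> is available
        // (CppStandard/CppStandardVersion exist in UE 4.22 and later).
        CppStandard = CppStandardVersion.Cpp17;
    }
}
```

Regenerating the Visual Studio project files afterwards should update IntelliSense as well.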

Calculate combined standard deviation

I have data that I fit with NonlinearModelFit, using two fitting parameters, c1 and c2.

When I use nlm["ParameterTable"] // Quiet I get the following table:

[image: parameter table listing the estimates and standard errors of c1 and c2]

If I have an equation such as:

eq = (2.303*((70 + 273.15)^2)*(c1/c2))/1000

Is there any code (as opposed to doing it manually) I can use to calculate the value of eq together with its combined standard deviation, based on the standard deviations of c1 and c2 from the table?
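
For reference, if "combined standard deviation" means first-order (delta-method) error propagation, the general formula is

$\sigma_{eq} \approx \sqrt{\left(\frac{\partial eq}{\partial c_1}\right)^2 \sigma_{c_1}^2 + \left(\frac{\partial eq}{\partial c_2}\right)^2 \sigma_{c_2}^2 + 2\,\frac{\partial eq}{\partial c_1}\,\frac{\partial eq}{\partial c_2}\,\mathrm{Cov}(c_1, c_2)}$

and since eq here is a constant times $c_1/c_2$, this reduces to

$\frac{\sigma_{eq}}{|eq|} \approx \sqrt{\frac{\sigma_{c_1}^2}{c_1^2} + \frac{\sigma_{c_2}^2}{c_2^2} - \frac{2\,\mathrm{Cov}(c_1, c_2)}{c_1 c_2}}$

(the covariance term comes from the fit's parameter covariance matrix; dropping it assumes the parameters are uncorrelated, which is rarely exactly true for a nonlinear fit).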

Thank you!

PyCrypto based encrypted Property class for Google App Engine standard – is this AES implementation secure

I have a need to encrypt OAuth access and refresh tokens stored in the Google Cloud Datastore.

The goal here is to ensure that if the Datastore entities are accessed independently of the code, the OAuth tokens will be encrypted and thus unusable.

This is not intended to protect against situations where both the code and Datastore are exposed together.

To securely store the data, I have leveraged PyCrypto’s AES implementation, and created a custom property type that automatically encrypts/decrypts the properties when accessing them.

The logic is as follows:

  • To store – I generate a random initialization vector, encrypt the data, then I base64 encode both the initialization vector and the encrypted data, and store them together in a text property.

  • To retrieve – I fetch the text data, slice off the base64 encoded initialization vector, and proceed to decode then decrypt the remaining data.

In addition to securing my own application, I am considering publishing the details and distributing the relevant code for others, so I want to ensure I have a secure or "correct" implementation of this functionality.

(Note I have stripped out App Engine specific code and just included relevant encryption code here, for simplicity. The actual implementation allows it to be dropped into existing Datastore models in a backwards-compatible fashion).

    from Crypto.Cipher import AES
    from Crypto import Random
    from base64 import b64encode, b64decode
    from meta_secure import aes_key  # aes_key is a 32 digit alphanumeric string (GUID)

    # encryption scheme
    def encrypt_value(value):
        rand = Random.new()
        init_vector = rand.read(16)
        aes = AES.new(aes_key, AES.MODE_CFB, init_vector)
        encrypted = b64encode(aes.encrypt(value))
        return '%s%s' % (b64encode(init_vector), encrypted)

    def decrypt_value(value):
        init_vector = b64decode(value[:24])
        aes = AES.new(aes_key, AES.MODE_CFB, init_vector)
        decrypted = aes.decrypt(b64decode(value[24:]))
        return decrypted

Have I used PyCrypto and AES correctly for the goal as stated above?

In standard 5e, does elf weapon training do anything useful at character creation?

Working just with the basic game (read: PHB, no expansions), I was trying to outfit an elven cleric, but the starting equipment only includes simple weapons (or a warhammer, which is only useful for dwarves). Do I really have to start with suboptimal simple weapons and upgrade to a longsword or longbow (replacing mace/crossbow, respectively) as funds become available?

This seems to be a broader problem than just the one I see here, because every class that allows you to equip martial weapons at the start also grants martial weapon proficiency — making the elf weapon training redundant. Did I miss some special rule that allows you to treat "race-specific weapons" as simple weapons for initial character creation? I'd even be happy if the shortsword were "simple", but it isn't — and playing an elf who favors dwarf weapons seems kind of silly.

Is this the correct “standard form” of nonlinear programming (optimization) problem and if it is why it’s in this form?

Rather a simple question, I guess, though it makes me wonder. The standard form I've found in the book (and on the wiki) is something like this:

$\min f(x)$

$\mathrm{s.t.}$

$h_i(x) = 0$

$g_i(x) \le 0$

Is this considered a "standard form" for nonlinear optimization problems? And if it is, why is it defined like this? Why does it have to be the min of the function, and why do the constraints have to be either equal to 0 or less than or equal to 0? I couldn't actually find any answer to why it is the way it is. Is there some important reason why it couldn't be max, for example?
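
For what it's worth, the other orientations can always be rewritten into this form, which suggests the choice is a convention rather than a restriction:

$\max f(x) = -\min\,(-f(x))$

$g_i(x) \ge 0 \iff -g_i(x) \le 0$

so a single canonical form (minimize, constraints $\le 0$ or $= 0$) covers every case.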

What are the subspecies names for the standard race choices in the D&D 3.5 Player's Handbook? What is the appearance of these subspecies, and a few others?

I am playing D&D on a Neverwinter Nights Enhanced Edition module using the D&D 3.5 ruleset. Please note that this is not a Neverwinter Nights game question. This is a D&D 3.5 lore question.

I am having difficulty finding the subspecies names for the standard race choices offered in the Player's Handbook. I am also having difficulty finding descriptions and images of their appearance online. I am avoiding 4e or 5e images and information because some of the lore has changed.

Here is my list of questions. Hair and skin are all I need for a description. If you can include a picture link, that would be very helpful. I will also gladly look at any online resource that answers my questions and saves people the time of writing out answers.

What is the standard elf race subspecies name in D&D 3.5? What is their suggested appearance?

What is the standard gnome race subspecies name in D&D 3.5? What is their suggested appearance?

What is the standard dwarf race subspecies name in D&D 3.5? What is their suggested appearance?

What is the standard halfling race subspecies name in D&D 3.5? What is their suggested appearance?

What does a deep dwarf look like?

What does a wild elf look like?

What does a wood elf look like?

What does a gray elf look like?

What does a forest gnome look like?

What does a lightfoot halfling look like?

What does a tallfellow halfling look like?

What is a tribal orc? What does it look like?

What is a deep orc? What does it look like?

Estimating a user's standard deviation given avg, min, max for various tests

Given a series of tests, where we are given one user's score, the overall minimum, the overall maximum, and the overall average, how would I estimate the user's standard deviation on total score (i.e. the sum of all their tests)?

We cannot assume that the lowest scoring person from one test was the lowest scoring in the next test, but I think it is fair to assume that people generally stay within some score bands (although if this can be done without that assumption, that would be better).

My intuition tells me that this seems to be some sort of application of Monte Carlo, but I can’t seem to figure out how to actually do this.
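
One way to sketch the Monte Carlo idea: since only min, max, and average are known per test, assume some distribution consistent with those three numbers (here a triangular distribution, a pure modeling assumption), simulate many totals, and take their standard deviation. This estimates the spread of totals under independence between tests; the "score bands" correlation would need an extra model on top.

```javascript
// Draw from a triangular distribution on [min, max] with the given mode.
function sampleTriangular(min, max, mode) {
  const u = Math.random();
  const f = (mode - min) / (max - min);
  return u < f
    ? min + Math.sqrt(u * (max - min) * (mode - min))
    : max - Math.sqrt((1 - u) * (max - min) * (max - mode));
}

// Estimate the standard deviation of the total score by simulation.
// tests: [{min, max, avg}, ...]. The triangular mean is (min+max+mode)/3,
// so the mode is chosen as 3*avg - min - max (clamped into [min, max])
// to make each simulated test match the reported average.
function estimateTotalSd(tests, trials = 100000) {
  const totals = [];
  for (let t = 0; t < trials; t++) {
    let sum = 0;
    for (const { min, max, avg } of tests) {
      const mode = Math.min(max, Math.max(min, 3 * avg - min - max));
      sum += sampleTriangular(min, max, mode);
    }
    totals.push(sum);
  }
  const mean = totals.reduce((a, b) => a + b, 0) / totals.length;
  const ss = totals.reduce((a, b) => a + (b - mean) ** 2, 0);
  return Math.sqrt(ss / (totals.length - 1));
}
```

The triangular choice is arbitrary; any distribution whose support and mean match the given min/max/avg could be substituted without changing the overall scheme.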