How to generate a deterministic finite automaton for a given language

Problem: Write a program which generates a deterministic finite automaton that accepts a given language. The language is defined by an alphabet and start/end substrings.

For example: Alphabet = {a, b, c}; start substring = "ab"; end substring = "cb"

I need to construct a DFA which will accept strings of the kind: {abcb, abaacb, abcabcb, …}.

Question: What algorithm should I use to write a function that constructs DFAs for this set of problems?
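One standard approach, sketched below (my own illustration; none of the names come from the question): check the required start substring with a simple prefix-matching DFA, track the required end substring with a KMP-style automaton, and run the two together as a product automaton. The function below simulates that product directly.

```python
def suffix_dfa(pattern, alphabet):
    """KMP-style DFA: state s = length of the longest prefix of `pattern`
    that is a suffix of the input read so far. Accepting state: len(pattern)."""
    m = len(pattern)
    border = [0] * (m + 1)  # border[s]: longest proper border of pattern[:s]
    k = 0
    for i in range(1, m):
        while k and pattern[i] != pattern[k]:
            k = border[k]
        if pattern[i] == pattern[k]:
            k += 1
        border[i + 1] = k
    delta = [dict() for _ in range(m + 1)]
    for s in range(m + 1):
        for c in alphabet:
            if s < m and c == pattern[s]:
                delta[s][c] = s + 1
            elif s == 0:
                delta[s][c] = 0
            else:
                delta[s][c] = delta[border[s]][c]  # fall back, as in KMP
    return delta

def accepts(word, start, end, alphabet):
    """Product automaton: prefix progress x KMP suffix state."""
    sdelta = suffix_dfa(end, alphabet)
    p_state = 0   # progress through the required prefix; None = dead state
    s_state = 0
    for c in word:
        if c not in alphabet:
            return False
        if p_state is None:
            pass  # prefix already failed; stay dead
        elif p_state < len(start):
            p_state = p_state + 1 if c == start[p_state] else None
        s_state = sdelta[s_state][c]
    return p_state == len(start) and s_state == len(end)

# Examples from the question:
# accepts("abcb",   "ab", "cb", {"a", "b", "c"}) -> True
# accepts("abaacb", "ab", "cb", {"a", "b", "c"}) -> True
```

The product construction generalizes: any other constraints expressible as DFAs (forbidden substrings, length mod k, etc.) can be intersected the same way.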

How to generate short fixed-length cryptographic hashes?

I am trying to implement a kind of email verification system with a node.js server that keeps no state.

The strategy

  1. User sends their email address to the server
  2. Server generates a 4-digit code based on the email address and sends it to the user via email
  3. User sends the received code plus the email address back to the server
  4. Server re-generates the 4-digit code from the email address and compares it with the code sent by the user

My implementation to generate the 4-digit code

  1. Create a hex digest using the HMAC SHA-256 hash function
  2. Take the first 3 characters of the digest
  3. Convert them to an integer
  4. If the length is < 4, pad the code with zeros (the implementation prepends them)
    const crypto = require('crypto')

    const get4DigitsCode = (message) => {
      const hash = crypto
        .createHmac('sha256', Buffer.from(SECRET_KEY, 'hex'))
        .update(message)
        .digest('hex')

      const first3HexCharacters = hash.slice(0, 3)
      const int = parseInt(first3HexCharacters, 16)

      let code = int.toString()
      code =
        Array(4 - code.length)
          .fill(0)
          .join("") + code

      return code
    }

After generating codes for 8293 email addresses, I noticed that I had 4758 duplicates. Is it normal to have this many duplicates for a code of this sort? Are my strategy and my implementation secure (against guessing the code)?
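That duplicate count is expected, not a bug. A back-of-the-envelope check (my own sketch, not part of the question's code): three hex characters can only take 16³ = 4096 distinct values, so with 8293 emails the pigeonhole principle alone forces duplicates, and the birthday-style estimate below lands very close to the observed 4758.

```python
n_codes = 16 ** 3   # distinct values of the first 3 hex digest characters: 4096
n_emails = 8293

# Expected number of distinct codes if each code behaves like a uniform
# random draw with replacement (a reasonable model for an HMAC output):
expected_distinct = n_codes * (1 - (1 - 1 / n_codes) ** n_emails)
expected_duplicates = n_emails - expected_distinct

# expected_duplicates comes out around 4.7k, close to the observed 4758;
# the hash is fine, the output space is just tiny.
print(expected_distinct, expected_duplicates)
```

Guessability is the same issue from the other side: with only 4096 possible codes, an attacker has a 1-in-4096 chance per guess, so rate limiting and code expiry matter far more here than the choice of hash function.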

Generate all combinations of values that are less than array’s elements and have a sum = target

I want to find a way to generate sets that contain elements that sum to a certain target. Initially, I have an array that contains elements representing the maximum value that can be stored in that index.

For example, the input is [8,6,1] and the target is 10. The algorithm should produce all sets that have elements [ (<=8), (<=6), (<=1) ] such that their sum is equal to 10. Examples include: [8,1,1], [8,2,0], [7,3,0], …

A major consideration for this algorithm is that it should work on any input length (the above example has a length of 3).

I think the solution is close to the subset sum problem, but I wasn’t able to figure it out. Any help is appreciated.

Side note: Python code is preferred, but pseudo-code is fine.
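Here is a recursive backtracking sketch of the kind of enumeration being asked for (my own illustration; it assumes non-negative integers at each position, as the [8, 2, 0] example suggests):

```python
def bounded_sums(limits, target):
    """Yield every list v with 0 <= v[i] <= limits[i] and sum(v) == target.
    Works for any input length."""
    def rec(i, remaining, prefix):
        if i == len(limits):
            if remaining == 0:
                yield list(prefix)
            return
        # prune: even taking the maximum everywhere later cannot reach `remaining`
        if remaining > sum(limits[i:]):
            return
        for v in range(min(limits[i], remaining), -1, -1):
            prefix.append(v)               # choose
            yield from rec(i + 1, remaining - v, prefix)
            prefix.pop()                   # un-choose (backtrack)
    yield from rec(0, target, [])

# e.g. bounded_sums([8, 6, 1], 10) yields [8, 2, 0], [8, 1, 1], [7, 3, 0], ...
```

This differs from classic subset sum in that each index contributes a value from a range rather than being in/out, but the same prune-and-recurse structure applies.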

Thanks

How to generate a certificate that keeps the CSR's SANs using openssl x509?

I already know how to add SANs to a CSR, and I know it is possible to add them to the certificate again using openssl x509, like this: openssl x509 -req -extfile <(printf "subjectAltName=IP:xxx") -days xxx -in xxx.csr -signkey xxx.key -out xxx.crt

but I want to find a way to do that in one command line, without using a config file.

Thanks

Generate PPT slides from SharePoint data

There is a form in SharePoint, and I need to generate a one-page PowerPoint slide using some data from that SharePoint list. I'm thinking of creating a template in PowerPoint. Say there are 5 blocks in this template, and the SharePoint list has "name", "surname", etc. sections. I want to import the "name" into one of the blocks, and so on. All of the data is text. Is writing such a program possible? If so, how can I do that?

Thanks in advance.

EF Core Scaffold-DbContext does not generate .HasIndex statements

Given any schema (here's one below), when running the EF Core scaffold command, all goes well, but the HasIndex statements are missing. Why? Also, if the generated model does not have those .HasIndex() calls, will EF Core run queries differently?

    Scaffold-DbContext "Server=$YourServer;Database=ContosoUniversity1;Trusted_Connection=True;" Microsoft.EntityFrameworkCore.SqlServer -OutputDir Models -Force

    CREATE TABLE [Instructor] (
        [ID] int NOT NULL IDENTITY,
        [LastName] nvarchar(50) NOT NULL,
        [FirstMidName] nvarchar(50) NOT NULL,
        [HireDate] datetime2 NOT NULL,
        CONSTRAINT [PK_Instructor] PRIMARY KEY ([ID])
    );

How to generate mipmaps, and check they are done properly?

I am implementing mipmapping for my textures for the first time.

I have two questions:

  1. How do I check whether the mipmaps were generated properly?
  2. How do I set mipmaps for a texture loaded from a file?

First one: I created a texture in the application.

    ID3D11Texture2D* TEX;
    D3D11_TEXTURE2D_DESC DESC;
    DESC.ArraySize = 1;
    DESC.BindFlags = D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE;
    DESC.CPUAccessFlags = 0;
    DESC.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
    DESC.Width = 256;
    DESC.Height = 256;
    DESC.MipLevels = 9;
    DESC.MiscFlags = D3D11_RESOURCE_MISC_GENERATE_MIPS;
    DESC.SampleDesc.Count = 1;
    DESC.SampleDesc.Quality = 0;
    DESC.Usage = D3D11_USAGE_DEFAULT;
    r_assert(graphic->Device()->CreateTexture2D(&DESC, nullptr, &TEX));

    ID3D11ShaderResourceView* TEXV;
    D3D11_SHADER_RESOURCE_VIEW_DESC TEXVDESC;
    TEXVDESC.Format = DESC.Format;
    TEXVDESC.ViewDimension = D3D11_SRV_DIMENSION_TEXTURE2D;
    TEXVDESC.Texture2D.MipLevels = -1;
    TEXVDESC.Texture2D.MostDetailedMip = 0;
    r_assert(graphic->Device()->CreateShaderResourceView(TEX, &TEXVDESC, &TEXV));

    graphic->DContext()->GenerateMips(TEXV);

    D3D11_TEXTURE2D_DESC desc;
    TEX->GetDesc(&desc);

Not sure, but I think this generates the mipmaps for TEX; I just want to know how to check/debug whether the mipmaps were actually generated well.


Second: I am also using DirectX::CreateWICTextureFromFileEx to create a texture from a file.

    resource = nullptr;
    newSRV = nullptr;

    r_assert(
        DirectX::CreateWICTextureFromFileEx(
            graphic->Device(),
            (L"Data\\Texture\\" + fileName).c_str(),
            0,
            D3D11_USAGE_DEFAULT,
            D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE,
            0,
            D3D11_RESOURCE_MISC_GENERATE_MIPS,
            DirectX::WIC_LOADER_DEFAULT,
            &resource,
            &newSRV)
    );
    graphic->DContext()->GenerateMips(newSRV);

    ID3D11Texture2D* tex = nullptr;
    r_assert(
        resource->QueryInterface(IID_ID3D11Texture2D, (void**)&tex)
    );
    D3D11_TEXTURE2D_DESC desc;
    tex->GetDesc(&desc);

And I checked that desc.MipLevels is 1, which means there is no mipmapping. So how do I generate mipmaps when using the WIC function?

Any advice would be appreciated.

Generate a unique random string to be used as the value of a field in newly created rows in PostgreSQL

I have this statement, which generates a random string:

    SELECT string_agg(
        substr('abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789',
               ceil(random() * 62)::integer, 1),
        '')
    FROM generate_series(1, 20);

My question: how do I implement this inside a function that automatically assigns the value of the above statement to a field named url_prefix when new records are created (on INSERT)?
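One common way to wire this up is a sketch along these lines (assuming PostgreSQL; only url_prefix is taken from the question, the table name and function name are made up): wrap the statement in a function and use it as the column's DEFAULT, so every INSERT fills the field automatically without a trigger.

```sql
-- Illustrative sketch; "my_table" and "random_url_prefix" are invented names.
CREATE OR REPLACE FUNCTION random_url_prefix() RETURNS text AS $$
    SELECT string_agg(
        substr('abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789',
               -- floor(...) + 1 avoids the rare position-0 case when random() = 0
               floor(random() * 62)::integer + 1, 1),
        '')
    FROM generate_series(1, 20);
$$ LANGUAGE sql VOLATILE;

ALTER TABLE my_table
    ALTER COLUMN url_prefix SET DEFAULT random_url_prefix();

-- Randomness alone does not guarantee uniqueness; a UNIQUE constraint makes
-- the (astronomically rare) collision fail loudly instead of silently.
ALTER TABLE my_table ADD CONSTRAINT url_prefix_unique UNIQUE (url_prefix);
```

A BEFORE INSERT trigger that sets NEW.url_prefix would work too, but a DEFAULT is simpler when the value never depends on other columns.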

How can I generate a heat map from AnyDice results?

I want to show some effects of changing dice rolls on a heat map, where on one axis I have task difficulty (level of a spell, quality of lock to pick etc), and on another character level / proficiency bonus. Color should represent chance of success.

Usually, graphs from AnyDice are hard to read at a glance: it generates many lines or bars, when I want it all on one map.

Here is an example program: https://anydice.com/program/1451d – fifty lines of probabilities; not so nice to look at, and not useful to post in answers here.

How can I generate a heat map from AnyDice results?

I want something like what is described here on a sister site: What is a probability heat map? I would prefer an option in AnyDice, if they implemented it and I just don’t see it. A tool that can take an export from AnyDice and generate a heat map would be great, too.

Using a spreadsheet and conditional formatting, I managed to get roughly what I wanted, but it was a tedious and lengthy process: Example
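As far as I know AnyDice has no built-in heat-map view, but the grid behind one is easy to compute outside it. A hypothetical Python sketch (the d20 + bonus vs. DC check and the axis ranges are my examples, not the linked program): build the success-chance grid, then hand it to whatever colors it.

```python
def success_chance(bonus, dc):
    """Probability that a d20 roll plus `bonus` meets or beats `dc`."""
    hits = sum(1 for roll in range(1, 21) if roll + bonus >= dc)
    return hits / 20

bonuses = range(2, 7)    # one axis: proficiency bonus
dcs = range(10, 21)      # other axis: task difficulty

grid = [[success_chance(b, dc) for dc in dcs] for b in bonuses]

# Print as rows of percentages: paste into a spreadsheet and apply conditional
# formatting, or pass `grid` to matplotlib's imshow() for an actual heat map.
for b, row in zip(bonuses, grid):
    print(b, " ".join(f"{p:4.0%}" for p in row))
```

The same loop can parse AnyDice's "export" output instead of computing probabilities directly, which avoids re-implementing more complicated rolls.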

How to generate all combinations given an array of elements using backtracking?

Given an array, generate all combinations (more precisely, all orderings of all non-empty subsets, as the output below shows).

For example:

Input: {1,2,3}

Output: {1}, {2}, {3}, {1,2}, {2,1}, {1,3}, {3,1}, {2,3}, {3,2}, {1,2,3}, {1,3,2}, {2,1,3}, {2,3,1}, {3,1,2}, {3,2,1}

I am practicing Backtracking algorithms and I think I understand the general idea of backtracking. You are essentially running a DFS to find the path that satisfies a condition. If you hit a node that fails the condition, exit the current node and start at the previous node.

However, I am having trouble understanding how to implement the traverse part of the implicit tree.

My initial idea is to traverse down the left most path which will give me {1}, {1,2}, {1,2,3}. However, once I backtrack to 1, how do I continue adding the 3 to get {1,3} and {1,3,2} afterwards? Even if I have 2 pointers, I would need it to point to the 2 to eventually get {1,3,2}.

Am I approaching this problem correctly by drawing this implicit tree and trying to code it? Or is there another approach I should take?

I am not looking for code to solve this, rather I am looking for some insight on solving these kinds of questions.
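Since the question asks for insight rather than a full solution, here is a minimal sketch (mine, not the only way) of just the traversal mechanism that resolves the {1,3} confusion: at each node you loop over every still-unused element, so after the recursion returns from choosing 2 (having produced {1,2} and {1,2,3}), the same loop's next iteration chooses 3 from the same node, yielding {1,3} and then {1,3,2}. Recording every path, not just the leaves, is what makes the partial sets appear in the output.

```python
def all_arrangements(items):
    """Return every ordering of every non-empty subset of `items`."""
    results = []
    used = [False] * len(items)
    path = []

    def backtrack():
        if path:
            results.append(list(path))  # record the current partial path too
        for i, x in enumerate(items):
            if used[i]:
                continue
            used[i] = True              # choose
            path.append(x)
            backtrack()                 # explore
            path.pop()                  # un-choose: this is the "backtrack"
            used[i] = False

    backtrack()
    return results
```

For [1, 2, 3] this produces 3 + 6 + 6 = 15 results, matching the listing above; the implicit tree you drew is exactly what the loop-plus-recursion walks.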
