Using a fixed $key and variable $data vs. a partially variable $key with fixed $data in PHP’s hash_hmac()

This question was originally asked on Stack Overflow, but it was suggested that I ask it here as well.

  • I’m not looking to improve on hash_hmac() functionality. I’m rather interested in the $uri in the examples below.

The theory is that we typically create signed URIs like:

$superSecret = 'abc';
$data = 'https://localhost/verify/{user-id}/{email}';
$hash = hash_hmac('sha256', $data, $superSecret);

$uri = $data . '/?hash=' . $hash;

Then we can validate the signature by recreating the hash and calling hash_equals(). If any part of the data string changed, hash_equals() returns false.
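For context, the verification side would look roughly like this (a sketch only; $userId, $email and the hash query parameter are assumed to come from the incoming request):

$superSecret = 'abc';
// rebuild the signed data from the request path ($userId and $email come from the URI)
$data = 'https://localhost/verify/' . $userId . '/' . $email;
$expected = hash_hmac('sha256', $data, $superSecret);
$received = $_GET['hash'] ?? '';

// constant-time comparison; false as soon as any part of $data was tampered with
$valid = hash_equals($expected, $received);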

What happens if we switch some parameters around? This time, instead of hashing different data, we hash the same data every time but with different keys.

I.e.

$superSecret = 'abc' . $userId . $email;
$data = 'https://localhost/verify';
$hash = hash_hmac('sha256', $data, $superSecret);

$uri = $data;

The above are dumbed-down, generalized examples, but what I’m more interested in is whether the concept is correct: would using different keys to hash the same data be as secure as using different data hashed with the same key?

Keep in mind that the ‘abc’ of $superSecret is never exposed; $userId and $email are concatenated onto the end of $superSecret.

The original question for those interested https://stackoverflow.com/questions/60401068/is-using-a-variable-key-with-constant-data-as-secure-as-using-a-constant-key-wit?noredirect=1#comment106850148_60401068

PTAS for Multiple Knapsack with Uniform Capacities, fixed number of Knapsacks

Consider the following problem:

We are given a collection of $n$ items $I = \{1,\dots,n\}$; each item $i$ has a size $0 < s_i \le 1$ and a profit $p_i > 0$. There are $m$ (a fixed number) unit-capacity knapsacks. A feasible solution is an $m$-tuple $U=(U_1,\dots,U_m)$ such that the total size of the items in each knapsack does not exceed its capacity, and each item is packed in at most one knapsack. More formally:

  • for every $j$, $1 \le j \le m$: $U_j \subseteq I$ and $\sum_{i \in U_j} s_i \le 1$
  • for every $j, l$, $1 \le j < l \le m$: $U_j \cap U_l = \emptyset$

The profit of a feasible solution is $\sum_{j=1}^m \sum_{i \in U_j} p_i$. The goal is to find a feasible solution of maximum profit.

I’m trying to show a PTAS for the problem.

It was suggested to use linear programming. I thought about the following (basic) linear program:

maximise $\sum_{j=1}^m \sum_{i=1}^n x_{ij} p_i$

under the constraints:

  • for every $1 \le j \le m$: $\sum_{i=1}^n s_i x_{ij} \le 1$ (the size of the items in each knapsack doesn’t exceed its capacity)
  • for every $1 \le i \le n$: $\sum_{j=1}^m x_{ij} \le 1$ (each item is packed in no more than one knapsack)
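For intuition (with numbers I made up): even for $m=1$ the optimum of the relaxation (taking $0 \le x_{ij} \le 1$) can be strictly fractional. With two items of sizes $s_1 = s_2 = 0.6$ and profits $p_1 = p_2 = 1$, the LP can set $x_{11} = 1$ and $x_{21} = \tfrac{2}{3}$ for a value of $\tfrac{5}{3}$, while any integral packing has value $1$. So it seems some rounding or post-processing step is needed on top of the LP.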

I don’t know how to proceed from here. I’m not sure how to develop an algorithm (i.e., choose which knapsack to put each item in) based on this linear program. Can anyone please give me a clue?

Thanks.

LOAD DATA LOCAL INFILE on fixed-width files inserting all columns with NULL values

Last month I had occasion to load about 50 GB of data from fixed-width .dat files. There were 7 files, so I created 7 tables and 7 LOAD DATA LOCAL INFILE scripts to load the data. It all worked fine, resulting in over 70 million rows.

The data is updated on the 10th of the month, so I downloaded the new files and ran my scripts on them. Only one worked; the rest loaded the right number of rows but with all of the columns NULL. I’ve compared the previous files with the new ones and cannot find any difference that would cause this. I’ve put a small amount of the failing data in a test.dat file and am getting the same results, but I have not been able to determine why this is happening, or why one script works and the rest don’t. I don’t see any difference between the files or the SQL that loads them. I’ve tried changing the encoding, line endings, permissions, ownership, and various other things without success. There are no errors thrown; it just loads NULL values. Has anyone run across this before?

Here is an example table with the load SQL:

DROP TABLE IF EXISTS exp_gpoper;
CREATE TABLE `exp_gpoper` (
  `_id` int(11) NOT NULL AUTO_INCREMENT,
  `date_updated` datetime NOT NULL DEFAULT current_timestamp() ON UPDATE current_timestamp(),
  `pun_county_num` varchar(100) DEFAULT '',
  `pun_lease_num` varchar(100) DEFAULT '',
  `pun_sub_num` varchar(100) DEFAULT '',
  `pun_merge_num` varchar(100) DEFAULT '',
  `company_number` varchar(100) DEFAULT '',
  `company_name` varchar(300) DEFAULT '',
  PRIMARY KEY (`_id`),
  KEY `date_updated` (`date_updated`),
  KEY `pun_county_num` (`pun_county_num`),
  KEY `pun_lease_num` (`pun_lease_num`),
  KEY `pun_merge_num` (`pun_merge_num`),
  KEY `pun_sub_num` (`pun_sub_num`),
  KEY `company_number` (`company_number`),
  KEY `company_name` (`company_name`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;

LOAD DATA LOCAL INFILE 'test.dat'
INTO TABLE exp_gpoper
(@_row)
SET `pun_county_num` = TRIM(SUBSTR(@row,1,3)),
    `pun_lease_num` = TRIM(SUBSTR(@row,4,6)),
    `pun_sub_num` = TRIM(SUBSTR(@row,10,1)),
    `pun_merge_num` = TRIM(SUBSTR(@row,11,4)),
    `company_number` = TRIM(SUBSTR(@row,15,7)),
    `company_name` = TRIM(SUBSTR(@row,22,255));

Here is the content of the test.dat file:

001000000000000077777OTC USE                                                                                                                                                                                                                                                         003000000000000077777OTC USE                                                                                                                                                                                                                                                         003000567000000020011M & D PUMPING SERVICE INC                                                                                                                                                                                                                                       003000587000000022576SCOGGINS PRODUCTION LLC                                                                                                                                                                                                                                         003000588000000022576SCOGGINS PRODUCTION LLC                                                                                                                                                                                                                                         003000639000000017441CHESAPEAKE OPERATING LLC                                                                                                                                                                                                                                        003000963000000019694BVD INC                                                                                                                                                                                                                                                         003000964000000018119BLAKE PRODUCTION CO INC                                                                                                                                                                                                                                         003002207124830022281SANDRIDGE EXPLORATION AND PRODUCTION LLC                                                                                                                                                                                                                        003002394000000020891SUPERIOR OIL & GAS LLC                                                                                                                                                                                                                                           

This works fine:

DROP TABLE IF EXISTS `exp_gpexempt`;
CREATE TABLE `exp_gpexempt` (
  `_id` int(11) NOT NULL AUTO_INCREMENT,
  `date_updated` datetime NOT NULL DEFAULT current_timestamp() ON UPDATE current_timestamp(),
  `pun_county_num` varchar(100) DEFAULT '',
  `pun_lease_num` varchar(100) DEFAULT '',
  `pun_sub_num` varchar(100) DEFAULT '',
  `pun_merge_num` varchar(100) DEFAULT '',
  `exemption_type` varchar(100) DEFAULT '',
  `code` varchar(100) DEFAULT '',
  `exemption_percentage` varchar(100) DEFAULT '',
  PRIMARY KEY (`_id`),
  KEY `date_updated` (`date_updated`),
  KEY `pun_county_num` (`pun_county_num`),
  KEY `pun_lease_num` (`pun_lease_num`),
  KEY `pun_merge_num` (`pun_merge_num`),
  KEY `exemption_type` (`exemption_type`),
  KEY `code` (`code`),
  KEY `exemption_percentage` (`exemption_percentage`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;

LOAD DATA LOCAL INFILE 'test2.dat'
INTO TABLE exp_gpexempt
(@_row)
SET `pun_county_num` = TRIM(SUBSTR(@_row,1,3)),
    `pun_lease_num` = TRIM(SUBSTR(@_row,4,6)),
    `pun_sub_num` = TRIM(SUBSTR(@_row,10,1)),
    `pun_merge_num` = TRIM(SUBSTR(@_row,11,4)),
    `exemption_type` = TRIM(SUBSTR(@_row,15,50)),
    `code` = TRIM(SUBSTR(@_row,65,5)),
    `exemption_percentage` = TRIM(SUBSTR(@_row,70,24));

Here is the content of the test2.dat file:

00300063900000School District                                   05   00000000000.000293000000 00300365500000State School Land Commission                      01   00000000000.125000000000 00301843300000State School Land Commission                      01   00000000000.125000000000 00302942700633State School Land Commission                      01   00000000000.125000000000 00302942800633State School Land Commission                      01   00000000000.125000000000 00303004100000Federal                                           02   00000000000.067632900000 00303004200000Federal                                           02   00000000000.125000000000 00303004600000Federal                                           02   00000000000.125000000000 00303004700000Federal                                           02   00000000000.125000000000 00303004800000Federal                                           02   00000000000.125000000000  

Knapsack with a fixed number of weights

Consider a special case of the knapsack problem in which all weights are integers and the number of distinct weights is fixed. For example, the weight of every item is 1 kg, 2 kg, or 4 kg. There is one unit of each item.

The problem can be solved using dynamic programming. Suppose the knapsack capacity is $C$, and the most valuable item of weight $w$ has a value of $v_w$. Then the maximum value of KNAPSACK($C$) is the maximum of the following three values:

$v_1$ + KNAPSACK($C-1$), $\quad v_2$ + KNAPSACK($C-2$), $\quad v_4$ + KNAPSACK($C-4$).
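For concreteness, here is a rough sketch (PHP, purely illustrative; the function name and structure are my own) of one exact way to exploit the grouping by weight: within each weight class only the most valuable items can ever be worth taking, so sort each class by value, precompute prefix sums, and enumerate how many items to take from each class. This stays polynomial when the number of distinct weights is a fixed constant.

// Sketch only: group items by weight, keep prefix sums of the most valuable
// items in each class, and enumerate how many items to take from each class.
function bestProfit(array $items, int $capacity): int {
    $classes = [];                        // weight => list of values
    foreach ($items as [$w, $v]) {
        $classes[$w][] = $v;
    }
    $weights = array_keys($classes);
    $prefix = [];                         // weight => prefix sums of values, sorted descending
    foreach ($weights as $w) {
        rsort($classes[$w]);              // most valuable first
        $ps = [0];
        foreach ($classes[$w] as $v) {
            $ps[] = end($ps) + $v;
        }
        $prefix[$w] = $ps;
    }
    $best = 0;
    // Recursively enumerate the number of items taken from each weight class.
    $enumerate = function (int $idx, int $remaining, int $profit) use (&$enumerate, &$best, $weights, $prefix): void {
        if ($idx === count($weights)) {
            $best = max($best, $profit);
            return;
        }
        $w = $weights[$idx];
        $maxCount = min(count($prefix[$w]) - 1, intdiv($remaining, $w));
        for ($k = 0; $k <= $maxCount; $k++) {
            $enumerate($idx + 1, $remaining - $k * $w, $profit + $prefix[$w][$k]);
        }
    };
    $enumerate(0, $capacity, 0);
    return $best;
}

// The counterexample from below: values 100, 99, 51, weights 2, 1, 1, capacity 3.
echo bestProfit([[2, 100], [1, 99], [1, 51]], 3); // prints 199 (100 + 99)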

Is there a more efficient algorithm? Particularly, is there a greedy algorithm for this problem?

I tried two greedy algorithms, but they fail already for weights 1 and 2. For example, suppose there are 3 items, with values 100, 99, 51 and weights 2, 1, 1:

  • If the capacity is 2, then the greedy algorithm that selects items by their value fails (it selects the 100 while the maximum is 99+51).
  • If the capacity is 3, then the greedy algorithm that selects items by their value/weight ratio fails (it selects the 99+51 while the maximum is 100+99).

However, this does not rule out the possibility that another greedy algorithm (sorting by some other criterion) can work. Is there a greedy algorithm for this problem? Alternatively, is there a proof that such an algorithm does not exist?

How can I get the most damage and mobility out of these fixed classes?

My character is an Air Genasi (+2 Dex, −2 Cha).

Classes: Fighter 2/Shadowcaster 1/Scout 1/Ardent 1;

He has the following stats: STR 10, DEX 20, CON 14, INT 14, WIS 11, CHA 9.

Feats: Point Blank Shot (PHB), Willing Deformity, Willing Deformity (Teeth), Weapon Finesse.

He nibbled off his own hand, so he is entirely one-handed and now uses a rapier, plus an off-hand 1d4 bite attack from Willing Deformity.

As a Shadowcaster (ToM) he has the following spells: Arrow of Dusk (2d4 nonlethal ray), Caul of Shadow (+1 deflection bonus to AC), and Steel Shadows (+6 to AC).

The class was taken mostly for flavor, because of the character’s extraplanar heritage.

As an Ardent (CP) he has the following options: the Freedom and Pain and Suffering mantles, and the power Dimension Hop, which allows a 10 ft. move as a swift action (+5 ft. for investing additional power points).

As a Scout (CA) he has the Skirmish ability (+1d6 damage on all attacks during a round in which the character has moved at least 10 ft.; +2d6 at 5th level and so on, scaling damage).

The basic style is: 1. Dimension Hop to the enemy, which triggers Skirmish (to trigger it you must move, and the Dimension Hop description says that you instantly move, rather than teleporting, transferring, or the like). 2. Full attack with rapier and bite (1d6+1d6, 1d4+1d6) at +7 (rapier) / +2 (bite) attack modifiers.

The +5 Dexterity bonus and Steel Shadows help me keep a high AC. My character is also good at intimidating, moving stealthily, and hiding. Potentially I can pick up Hustle (from the Freedom mantle) and Flicker (from the Shadowcaster class) to stay mobile.

I’m also thinking about the Arcane Strike feat, but I’m not sure.

So, the questions are: what can I do to improve the basic style? Are there other, more devastating combos? Which class from the Ardent/Scout/Shadowcaster triad is it better to progress in first? Second?

Books of D&D 3.5:

Player’s Handbook (PHB), Tome of Magic (ToM), Complete Adventurer (CA), Complete Psionic (CP), Expanded Psionics Handbook (EP)

How to design a chart that has an x-axis with a fixed range that is longer than the selected range?

I’m designing a web-based analytics dashboard that contains a chart whose x-axis represents the days of a campaign, and it always shows 31 days. However, the duration of a campaign varies; it may be predefined to any period between a single day and 31 days.
In cases when the campaign’s duration is shorter than 31 days, I want to make it clear which area of the chart is relevant.
Are there any existing best practice solutions for this scenario?
One idea I have is graying out the area that’s not required (see image below: a bar chart with a 31-day x-axis showing a 7-day campaign).

The chart’s width is fixed and optimized for displaying a 31-day range. I’m not changing the number of days on the x-axis, to avoid scenarios in which there is a single floating dot or bar on the screen, or several data points with vast space between them (see images below: a bar chart with a 7-day x-axis showing a 7-day campaign, and a bar chart with a 3-day x-axis showing a 3-day campaign).

Allow a fixed two rows to display the product title

Hi,

Please, how can I make the product titles AUTOMATICALLY display on TWO lines (around 10 words) in the WooCommerce archive (home & categories)?

The site is not live yet.

I have attached two PDF files of the WooCommerce archive (before and after). The “after” one is a photo edit to illustrate.

As you can see, I expect:

* To set a permanent, fixed two-line space for the product titles.
* If the title exceeds two lines, the 'read more' should be displayed like this: (…)
* Although…
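For what it’s worth, I imagine something along these lines in the theme’s functions.php could be a starting point for the word-trimming part (just a sketch; the 10-word cutoff is a guess, and the fixed two-line height itself would still need to be set in the theme’s CSS):

// Trim product titles on WooCommerce archive pages to about 10 words,
// appending "(…)" when the title is cut.
add_filter('the_title', function ($title, $post_id) {
    if ((is_shop() || is_product_category()) && in_the_loop()
            && get_post_type($post_id) === 'product') {
        return wp_trim_words($title, 10, ' (…)');
    }
    return $title;
}, 10, 2);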


Jumping to anchor within modal with a fixed header

Hi all,
I am having a bit of a problem.
I have a script for nested modals that, when a link is clicked, jumps straight to an anchor. It works fine.
However when styling, I would like to have a fixed header as the ".content" div scrolls to the anchor.
When ".content" has a height of 80vh it looks fine, but the script doesn't scroll to the anchor.
When ".content" has a height of 80% the script works, but the fixed header doesn't stay fixed and scrolls with the ".content".
How can I…


Compiz: Fixed window placement, which alternative?

At login, two windows are automagically opened. To give these a fixed location, I am trying the CompizConfig Settings Manager’s Place Windows function. The Fixed Windows Placement tab gives me three choices: Windows with fixed position, Windows with fixed placement mode, and Windows with fixed viewport.

What is the difference between these alternatives? I have tried to find an explanation, but miserably failed. Any light on the matter is gratefully received.

See also 1.