LIMIT and OFFSET a complex query

I have a query generated by an ORM (Sequelize). I ran into an issue where Sequelize fails; see these issues: https://github.com/sequelize/sequelize/issues/7344

https://github.com/sequelize/sequelize/issues/12200

Postgres Query:

    SELECT
        "feeds"."id", "feeds"."title", "feeds"."likes", "feeds"."description", "feeds"."files",
        "feeds"."allowComments", "feeds"."readConfirmation", "feeds"."isDraft",
        "feeds"."createdAt", "feeds"."updatedAt", "feeds"."companyId", "feeds"."createdById",
        "reads"."id" AS "reads.id",
        "reads->feeds_reads"."createdAt" AS "reads.feeds_reads.createdAt",
        "reads->feeds_reads"."updatedAt" AS "reads.feeds_reads.updatedAt",
        "reads->feeds_reads"."feedId" AS "reads.feeds_reads.feedId",
        "reads->feeds_reads"."userId" AS "reads.feeds_reads.userId",
        "createdBy"."id" AS "createdBy.id",
        "createdBy"."firstName" AS "createdBy.firstName",
        "createdBy"."jobTitle" AS "createdBy.jobTitle",
        "createdBy"."lastName" AS "createdBy.lastName",
        "createdBy"."profilePicture" AS "createdBy.profilePicture",
        "bookmarks"."id" AS "bookmarks.id",
        "bookmarks->feeds_bookmarks"."createdAt" AS "bookmarks.feeds_bookmarks.createdAt",
        "bookmarks->feeds_bookmarks"."updatedAt" AS "bookmarks.feeds_bookmarks.updatedAt",
        "bookmarks->feeds_bookmarks"."feedId" AS "bookmarks.feeds_bookmarks.feedId",
        "bookmarks->feeds_bookmarks"."userId" AS "bookmarks.feeds_bookmarks.userId",
        "units"."id" AS "units.id",
        "units"."parentId" AS "units.parentId",
        "units->feeds_units"."createdAt" AS "units.feeds_units.createdAt",
        "units->feeds_units"."updatedAt" AS "units.feeds_units.updatedAt",
        "units->feeds_units"."feedId" AS "units.feeds_units.feedId",
        "units->feeds_units"."unitId" AS "units.feeds_units.unitId",
        "units->users"."id" AS "units.users.id",
        "units->users->users_units"."createdAt" AS "units.users.users_units.createdAt",
        "units->users->users_units"."updatedAt" AS "units.users.users_units.updatedAt",
        "units->users->users_units"."userId" AS "units.users.users_units.userId",
        "units->users->users_units"."unitId" AS "units.users.users_units.unitId",
        "units->descendents"."id" AS "units.descendents.id",
        "units->descendents"."parentId" AS "units.descendents.parentId",
        "units->descendents->unitsancestor"."unitsId" AS "units.descendents.unitsancestor.unitsId",
        "units->descendents->unitsancestor"."ancestorId" AS "units.descendents.unitsancestor.ancestorId",
        "units->descendents->users"."id" AS "units.descendents.users.id",
        "units->descendents->users->users_units"."createdAt" AS "units.descendents.users.users_units.createdAt",
        "units->descendents->users->users_units"."updatedAt" AS "units.descendents.users.users_units.updatedAt",
        "units->descendents->users->users_units"."userId" AS "units.descendents.users.users_units.userId",
        "units->descendents->users->users_units"."unitId" AS "units.descendents.users.users_units.unitId",
        "teams"."id" AS "teams.id",
        "teams->feeds_teams"."createdAt" AS "teams.feeds_teams.createdAt",
        "teams->feeds_teams"."updatedAt" AS "teams.feeds_teams.updatedAt",
        "teams->feeds_teams"."feedId" AS "teams.feeds_teams.feedId",
        "teams->feeds_teams"."teamId" AS "teams.feeds_teams.teamId",
        "teams->peoples->teams_users"."createdAt" AS "teams.peoples.teams_users.createdAt",
        "teams->peoples->teams_users"."updatedAt" AS "teams.peoples.teams_users.updatedAt",
        "teams->peoples->teams_users"."userId" AS "teams.peoples.teams_users.userId",
        "teams->peoples->teams_users"."teamId" AS "teams.peoples.teams_users.teamId",
        "comments"."text" AS "comments.text",
        "comments"."id" AS "comments.id",
        "comments"."likes" AS "comments.likes",
        "comments"."parentId" AS "comments.parentId",
        "comments"."createdById" AS "comments.createdById",
        "comments"."createdAt" AS "comments.createdAt",
        "comments"."updatedAt" AS "comments.updatedAt",
        "comments->createdBy"."id" AS "comments.createdBy.id",
        "comments->createdBy"."firstName" AS "comments.createdBy.firstName",
        "comments->createdBy"."lastName" AS "comments.createdBy.lastName",
        "comments->createdBy"."jobTitle" AS "comments.createdBy.jobTitle",
        "comments->createdBy"."profilePicture" AS "comments.createdBy.profilePicture",
        "peoples->feeds_peoples"."createdAt" AS "peoples.feeds_peoples.createdAt",
        "peoples->feeds_peoples"."updatedAt" AS "peoples.feeds_peoples.updatedAt",
        "peoples->feeds_peoples"."feedId" AS "peoples.feeds_peoples.feedId",
        "peoples->feeds_peoples"."userId" AS "peoples.feeds_peoples.userId"
    FROM "feeds" AS "feeds"
    LEFT OUTER JOIN (
        "feeds_reads" AS "reads->feeds_reads"
        INNER JOIN "users" AS "reads" ON "reads"."id" = "reads->feeds_reads"."userId"
    ) ON "feeds"."id" = "reads->feeds_reads"."feedId"
    LEFT OUTER JOIN "users" AS "createdBy" ON "feeds"."createdById" = "createdBy"."id"
    LEFT OUTER JOIN (
        "feeds_bookmarks" AS "bookmarks->feeds_bookmarks"
        INNER JOIN "users" AS "bookmarks" ON "bookmarks"."id" = "bookmarks->feeds_bookmarks"."userId"
    ) ON "feeds"."id" = "bookmarks->feeds_bookmarks"."feedId"
    LEFT OUTER JOIN (
        "feeds_units" AS "units->feeds_units"
        INNER JOIN "units" AS "units" ON "units"."id" = "units->feeds_units"."unitId"
    ) ON "feeds"."id" = "units->feeds_units"."feedId"
    LEFT OUTER JOIN (
        "users_units" AS "units->users->users_units"
        LEFT OUTER JOIN "users" AS "units->users" ON "units->users"."id" = "units->users->users_units"."userId"
    ) ON "units"."id" = "units->users->users_units"."unitId"
    LEFT OUTER JOIN (
        "unitsancestor" AS "units->descendents->unitsancestor"
        LEFT OUTER JOIN "units" AS "units->descendents" ON "units->descendents"."id" = "units->descendents->unitsancestor"."unitsId"
    ) ON "units"."id" = "units->descendents->unitsancestor"."ancestorId"
    LEFT OUTER JOIN (
        "users_units" AS "units->descendents->users->users_units"
        LEFT OUTER JOIN "users" AS "units->descendents->users" ON "units->descendents->users"."id" = "units->descendents->users->users_units"."userId"
    ) ON "units->descendents"."id" = "units->descendents->users->users_units"."unitId"
    LEFT OUTER JOIN (
        "feeds_teams" AS "teams->feeds_teams"
        INNER JOIN "teams" AS "teams" ON "teams"."id" = "teams->feeds_teams"."teamId"
    ) ON "feeds"."id" = "teams->feeds_teams"."feedId"
    LEFT OUTER JOIN (
        "teams_users" AS "teams->peoples->teams_users"
        INNER JOIN "users" AS "teams->peoples" ON "teams->peoples"."id" = "teams->peoples->teams_users"."userId"
    ) ON "teams"."id" = "teams->peoples->teams_users"."teamId"
    LEFT OUTER JOIN "comments" AS "comments" ON "feeds"."id" = "comments"."feedId"
    LEFT OUTER JOIN "users" AS "comments->createdBy" ON "comments"."createdById" = "comments->createdBy"."id"
    LEFT OUTER JOIN (
        "feeds_peoples" AS "peoples->feeds_peoples"
        INNER JOIN "users" AS "peoples" ON "peoples"."id" = "peoples->feeds_peoples"."userId"
    ) ON "feeds"."id" = "peoples->feeds_peoples"."feedId"
    WHERE (
            "peoples"."id" = 11
            OR "feeds"."createdById" = 11
            OR "teams->peoples"."id" = 11
            OR "units->users"."id" = 11
            OR "units->descendents->users"."id" = 11
        )
        AND "feeds"."companyId" = 4
        AND "feeds"."isDraft" = false
        AND "feeds"."createdAt" < '2020-12-09 12:59:34.017 +00:00'
    LIMIT 20;

Here the LIMIT is not applied to distinct feeds: LIMIT 20 caps the joined result rows, so I get the same feed repeated 20 times instead of 20 different feeds.

How to fetch posts with offset in WordPress?

I have more than 100,000 posts and fetch 20,000 posts on every page load. On each refresh I want to display the next 20,000 posts rather than the same ones again.

Here is my Query

    $args = array(
        'post_type'      => 'ebay_product',
        'posts_per_page' => 20000,
        'post_status'    => array('publish'),
        'orderby'        => 'ID',
        'order'          => 'DESC'
    );

Please help me to solve this issue. Any solution appreciated!

Enemy explosion offSet is displaced away from desired point of origin

In the game, when an enemy is hit and explodes, the explosion animation appears offset toward the bottom right of the dead enemy’s center point.

I ran tests to adjust the explosion positioning with no luck. The debugger says that “this.offSet.y” and “this.offSet.x” are both -100, and I can’t work out how to adjust them either.

Image example: (the explosion renders below and to the right of the enemy’s center)

Here are the code snippets & CSS for bonus:

Explosion JS

    class Explosions {
        constructor(assetName) {
            this.count = 0;
            this.offSet = undefined;
            this.setOffSet(assetName);
        }

        setOffSet(assetName) {
            let asset = GameManager.assets[assetName];
            this.offSet = new Point((asset.width / 2) * -1, (asset.height / 2) * -1);
        }

        createExplosion(position) {
            let div = document.createElement("div");
            div.classList.add("explosion");
            let divId = 'explosion_' + this.count;
            div.id = divId;
            console.log(position);
            div.style.left = (position.x + this.offSet.x) + 'px';
            div.style.top = (position.y + this.offSet.y) + 'px';
            $(GameSettings.gameAreaDiv).append(div);
            setTimeout(function() {
                $('#' + divId).remove();
            }, GameSettings.explosionTimeout);
            this.count++;
        }
    }

Explosion CSS

    @keyframes explosion {
        0%   { background-image: url("../../assets/explosion/smallexplode1.png"); }
        10%  { background-image: url("../../assets/explosion/smallexplode2.png"); }
        20%  { background-image: url("../../assets/explosion/smallexplode3.png"); }
        30%  { background-image: url("../../assets/explosion/smallexplode4.png"); }
        40%  { background-image: url("../../assets/explosion/smallexplode5.png"); }
        60%  { background-image: url("../../assets/explosion/smallexplode6.png"); opacity: 0.9; }
        80%  { background-image: url("../../assets/explosion/smallexplode7.png"); opacity: 0.8; }
        90%  { background-image: url("../../assets/explosion/smallexplode8.png"); opacity: 0.5; }
        100% { background-image: url("../../assets/explosion/smallexplode9.png"); opacity: 0.3; }
    }

    .explosion {
        width: 100px;
        height: 100px;
        position: absolute;
        left: 500px;
        top: 300px;
        animation-name: explosion;
        animation-duration: 0.8s;
        background-repeat: no-repeat;
        z-index: 20;
    }

Any indication what the cause is?

Let me know if there are any other snippets that need adding, and please be patient with me, since I’m a student programmer who is still learning.

Thank you

How to find the size of offset w and the maximum number of pages per segment?

I am learning about Operating Systems, but the topic of physical and logical address translating seems a bit complicated to me.

So I am asked to find the size of the offset w and the maximum number of pages per segment, given that:

  • The logical address size is 16 bits,
  • Page size is 256 words,
  • and the segment table contains 2^10 entries.
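A common reading of this setup is that the 16-bit logical address is the within-segment address, split into a page number and an offset, while the 2^10 segment-table entries only fix how many segments exist. Under that (assumed) interpretation, the arithmetic can be checked with a short script:

```python
import math

LOGICAL_ADDRESS_BITS = 16        # bits in the within-segment logical address
PAGE_SIZE_WORDS = 256            # words per page
SEGMENT_TABLE_ENTRIES = 2 ** 10  # number of segments (does not affect w)

# The offset w must be wide enough to address every word in a page.
w = int(math.log2(PAGE_SIZE_WORDS))            # 8 bits

# The remaining bits of the logical address select the page.
page_number_bits = LOGICAL_ADDRESS_BITS - w    # 8 bits
max_pages_per_segment = 2 ** page_number_bits  # 256 pages

print(w, max_pages_per_segment)  # 8 256
```

Since 256 = 2^8, the offset w is 8 bits, leaving 16 − 8 = 8 bits of page number, i.e. at most 256 pages per segment.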

HLSL : Offset and length were out of bounds

    //Vertex Shader Constants
    float4x4 World;
    float4x4 View;
    float4x4 Projection;
    float4x4 WorldViewIT;

    //Color Texture
    texture Texture;

    //Normal Texture
    texture NormalMap;

    //Specular Texture
    texture SpecularMap;

    //Albedo Sampler
    sampler AlbedoSampler = sampler_state
    {
        texture = <Texture>;
        MINFILTER = LINEAR;
        MAGFILTER = LINEAR;
        MIPFILTER = LINEAR;
        ADDRESSU = WRAP;
        ADDRESSV = WRAP;
    };

    //NormalMap Sampler
    sampler NormalSampler = sampler_state
    {
        texture = <NormalMap>;
        MINFILTER = LINEAR;
        MAGFILTER = LINEAR;
        MIPFILTER = LINEAR;
        ADDRESSU = WRAP;
        ADDRESSV = WRAP;
    };

    //SpecularMap Sampler
    sampler SpecularSampler = sampler_state
    {
        texture = <SpecularMap>;
        MINFILTER = LINEAR;
        MAGFILTER = LINEAR;
        MIPFILTER = LINEAR;
        ADDRESSU = WRAP;
        ADDRESSV = WRAP;
    };

    //Vertex Input Structure
    struct VSI
    {
        float4 Position : POSITION0;
        float3 Normal : NORMAL0;
        float2 UV : TEXCOORD0;
        float3 Tangent : TANGENT0;
        float3 BiTangent : BINORMAL0;
    };

    //Vertex Output Structure
    struct VSO
    {
        float4 Position : POSITION0;
        float2 UV : TEXCOORD0;
        float3 Depth : TEXCOORD1;
        float3x3 TBN : TEXCOORD2;
    };

    //Vertex Shader
    VSO VS(VSI input)
    {
        //Initialize Output
        VSO output;

        //Transform Position
        float4 worldPosition = mul(input.Position, World);
        float4 viewPosition = mul(worldPosition, View);
        output.Position = mul(viewPosition, Projection);

        //Pass Depth
        output.Depth.x = output.Position.z;
        output.Depth.y = output.Position.w;
        output.Depth.z = viewPosition.z;

        //Build TBN Matrix
        output.TBN[0] = normalize(mul(input.Tangent, (float3x3)WorldViewIT));
        output.TBN[1] = normalize(mul(input.BiTangent, (float3x3)WorldViewIT));
        output.TBN[2] = normalize(mul(input.Normal, (float3x3)WorldViewIT));

        //Pass UV
        output.UV = input.UV;

        //Return Output
        return output;
    }

    //Pixel Output Structure
    struct PSO
    {
        float4 Albedo : COLOR0;
        float4 Normals : COLOR1;
        float4 Depth : COLOR2;
    };

    //Normal Encoding Function
    half3 encode(half3 n)
    {
        n = normalize(n);
        n.xyz = 0.5f * (n.xyz + 1.0f);
        return n;
    }

    //Normal Decoding Function
    half3 decode(half4 enc)
    {
        return (2.0f * enc.xyz - 1.0f);
    }

    //Pixel Shader
    PSO PS(VSO input)
    {
        //Initialize Output
        PSO output;

        //Pass Albedo from Texture
        output.Albedo = tex2D(AlbedoSampler, input.UV);

        //Pass Extra - Can be whatever you want, in this case will be a Specular Value
        output.Albedo.w = tex2D(SpecularSampler, input.UV).x;

        //Read Normal From Texture
        half3 normal = tex2D(NormalSampler, input.UV).xyz * 2.0f - 1.0f;

        //Transform Normal to WorldViewSpace from TangentSpace
        normal = normalize(mul(normal, input.TBN));

        //Pass Encoded Normal
        output.Normals.xyz = encode(normal);
        //Pass this instead to disable normal mapping
        //output.Normals.xyz = encode(normalize(input.TBN[2]));

        //Pass Extra - Can be whatever you want, in this case will be a Specular Value
        output.Normals.w = tex2D(SpecularSampler, input.UV).y;

        //Pass Depth (Screen Space, for lighting)
        output.Depth = input.Depth.x / input.Depth.y;

        //Pass Depth (View Space, for SSAO)
        output.Depth.g = input.Depth.z;

        //Return Output
        return output;
    }

    //Technique
    technique Default
    {
        pass p0
        {
            VertexShader = compile vs_3_0 VS();
            PixelShader = compile ps_3_0 PS();
        }
    }

This is my current shader. It produces an “Offset and length were out of bounds for the array or count is greater than the number of elements from index to the end of the source collection” ArgumentException, and I don’t understand what’s wrong with it.

Any help is appreciated.

In 5e, how can I offset the disadvantage created by not having a “big three” save proficiency?

I made a homebrew class. (I’m happy to share any details anyone deems necessary or even the entire class if it isn’t against the rules.) It’s a cha-based class which, for flavour reasons, has a variable secondary ability (depends on subclass). One subclass uses int, another uses str and the other uses wis.

It’s been brought to my attention that two of the subclasses suffer from not being proficient in any of the “big three” saves (Dex, Con, Wis). All official 5e classes get one “big three” save plus one of the others, so my class is at a defensive disadvantage roughly 2/3 of the time.

The class gets a feature at 11th level that allows it to gain advantage on saving throws with one ability of its choice for a while, but it’s a once-per-short-rest thing and it doesn’t have access to that from 1st level to 10th level.

For flavour reasons, the secondary abilities and save proficiencies of my class’s subclasses absolutely cannot be changed; those two subclasses are stuck with not having any “big three” save proficiencies. How else could I offset the disadvantage this creates?

I’ve thought about bumping the save-advantage feature down from 11th level to 5th level and relying on low-CR monsters not having too many AOE/save attacks, but that seems like a really powerful feature for 5th level and I’m not entirely convinced by the idea.

Direct and Associate Cache – Offset, Index, and Tag

I have two questions:

1. An 8-kB (8192-byte) direct-mapped cache has 16-byte lines. The system has 64-bit addresses, numbered from 0 on the right to 63 on the left. Which bits are associated with the offset, index, and tag?
2. A 16-kB (16384-byte) 4-way set-associative cache has 8-byte lines. The system has 64-bit addresses, numbered from 0 on the right to 63 on the left. Which bits are associated with the offset, index, and tag?

For the tag, the formula is: tag bits = address length − index bits − offset bits, correct?

Then for a direct-mapped cache the index bits are log2 of the number of lines in the cache, and the tag bits are everything else, right?

How would I calculate these? I’m a little confused about a set-associative cache vs. a direct-mapped cache.
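As a sanity check on those formulas (assuming the standard split: offset = log2(line size), index = log2(number of sets) where sets = cache size / line size / associativity, and tag = the remaining high bits; a direct-mapped cache is just the 1-way case), a short script:

```python
import math

def cache_fields(cache_bytes, line_bytes, address_bits, ways=1):
    """Return (offset_bits, index_bits, tag_bits) for a set-associative cache."""
    offset_bits = int(math.log2(line_bytes))   # selects a byte within a line
    sets = cache_bytes // line_bytes // ways   # lines are grouped into sets
    index_bits = int(math.log2(sets))          # selects a set
    tag_bits = address_bits - index_bits - offset_bits
    return offset_bits, index_bits, tag_bits

# Question 1: 8-kB direct-mapped cache, 16-byte lines, 64-bit addresses
print(cache_fields(8192, 16, 64, ways=1))    # (4, 9, 51)

# Question 2: 16-kB 4-way set-associative cache, 8-byte lines
print(cache_fields(16384, 8, 64, ways=4))    # (3, 9, 52)
```

So for the direct-mapped cache the offset is bits 3-0, the index is bits 12-4, and the tag is bits 63-13; for the 4-way cache the offset is bits 2-0, the index is bits 11-3, and the tag is bits 63-12.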

How can I find the offset rotation and position between two objects in 3D?

I have two objects in three dimensions; one of them (the Child) is attached to the other (the Parent) using the offset between them (XYZ position and rotation). Using a Vector3 library, how can I find the Child’s offset from the Parent?

Further explanation: say the Parent is at position X:0 Y:0 Z:0 with rotation RX:0 RY:0 RZ:0, and a Child is attached to it with the offset X:10 Y:10 Z:10, RX:90 RY:45 RZ:90 (which is what I want to recover). Now suppose the Parent has some other position and rotation: how can I get the offset between it and the Child, which should still be X:10 Y:10 Z:10, RX:90 RY:45 RZ:90? In other words, I want the Child’s position and rotation as if the Parent were the center of the map.

I’ve tried simple math and some approaches from the internet with no success, since I’m pretty bad at calculating rotations. (Vector3 and math libraries are pretty similar across languages, so any language should be fine.)
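Conceptually, the child’s local offset is its world pose mapped into the parent’s frame: subtract the parent’s position, rotate the difference by the inverse of the parent’s rotation, and compose the inverse parent rotation with the child’s rotation. A minimal self-contained sketch (plain Python, no external libraries; the X-then-Y-then-Z Euler order is an assumption, since engines differ in convention):

```python
import math

# --- minimal 3x3 rotation-matrix helpers ---

def rot_x(a):
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_y(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rot_z(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)] for i in range(3)]

def mat_vec(A, v):
    return [sum(A[i][k] * v[k] for k in range(3)) for i in range(3)]

def transpose(A):
    return [[A[j][i] for j in range(3)] for i in range(3)]

def euler_to_mat(rx, ry, rz):
    # Assumed convention: rotate about X, then Y, then Z (angles in radians).
    return mat_mul(rot_z(rz), mat_mul(rot_y(ry), rot_x(rx)))

# --- the actual answer: express the child in the parent's local frame ---

def child_offset(parent_pos, parent_R, child_pos, child_R):
    """Return (local_pos, local_R): the child's pose as if the parent sat
    at the origin with no rotation."""
    RpT = transpose(parent_R)                        # inverse of a rotation matrix
    d = [child_pos[i] - parent_pos[i] for i in range(3)]
    local_pos = mat_vec(RpT, d)                      # position in the parent frame
    local_R = mat_mul(RpT, child_R)                  # rotation relative to the parent
    return local_pos, local_R

# Round trip: attach a child at a known local offset, move the parent,
# then recover the same offset back.
local_pos_in = [10.0, 10.0, 10.0]
local_R_in = euler_to_mat(math.pi / 2, math.pi / 4, math.pi / 2)

parent_pos = [3.0, -2.0, 5.0]
parent_R = euler_to_mat(0.3, 0.5, 0.7)

child_pos = [parent_pos[i] + mat_vec(parent_R, local_pos_in)[i] for i in range(3)]
child_R = mat_mul(parent_R, local_R_in)

local_pos_out, local_R_out = child_offset(parent_pos, parent_R, child_pos, child_R)
print([round(v, 6) for v in local_pos_out])   # [10.0, 10.0, 10.0]
```

Most engines expose this directly (e.g. Unity’s Transform.InverseTransformPoint for the position, and Quaternion.Inverse(parent.rotation) * child.rotation for the relative rotation); working with quaternions also sidesteps the Euler-order ambiguity above.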