Getting artifacts with XP and crafting artifacts with iotum in Numenera

Numenera Destiny introduces crafting artifacts. Numenera Discovery allows one to spend 3 XP to obtain an artifact. Are these two statements correct?

  • By the Destiny rules, crafting an artifact costs no XP; it requires only iotum, parts, and time.
  • By the Discovery rules, a player can spend 3 XP to gain an artifact more or less instantly, with no iotum or parts expended.

I saw this, but it covers the Numenera 1 rules and does not touch on Discovery/Destiny.

Artifacts on top of platform sprite only when moving

I made a floating platform in Unity which has a propeller movement animation. I attached a script I wrote to the game object that adds slight vertical and horizontal movement via the game object’s transform.position. This creates some weird artifacts at the top of the sprite, but only when the movement code is active. When I don’t move the platform and just leave it still to animate, there are no artifacts.
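Roughly, the script does something like this (a simplified sketch; the exact offsets and speeds in my actual script differ):

```csharp
using UnityEngine;

// Sketch of the movement described above (illustrative values).
public class FloatingPlatform : MonoBehaviour
{
    public float amplitude = 0.1f;  // size of the drift
    public float speed = 2f;

    private Vector3 start;

    void Start()
    {
        // remember the resting position
        start = transform.position;
    }

    void Update()
    {
        // slight vertical and horizontal movement via transform.position
        float y = Mathf.Sin(Time.time * speed) * amplitude;
        float x = Mathf.Cos(Time.time * speed * 0.5f) * amplitude;
        transform.position = start + new Vector3(x, y, 0f);
    }
}
```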

Gameplay:

gif of gameplay

Sprite Import Settings:

sprite import settings

Sprite Renderer:

sprite renderer settings

Camera Settings:

camera settings

Resolution:

resolution settings

Finding artifacts using bulk_extractor via different methods

I have a specific set of artifacts, and I want to search for those particular artifacts in a disk image using bulk_extractor via different methods: random sampling, a stop list, an alert list, or text search. I want to know which method is most efficient, and how I can reproduce such a test scenario and get results.
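To make the four methods concrete, I mean invocations along these lines (flags as I understand them from the bulk_extractor manual; file names are placeholders, and flag spellings may vary between versions, so please check bulk_extractor -h):

```shell
# Text search: scan the image for a list of regular expressions
bulk_extractor -o out_find -F patterns.txt image.raw

# Alert list: flag any feature that appears in the alert ("red") list
bulk_extractor -o out_alert -r alert_list.txt image.raw

# Stop list: suppress known-benign features, leaving only the rest
bulk_extractor -o out_stop -w stop_list.txt image.raw

# Random sampling: process only a fraction of the image
bulk_extractor -o out_sample -s 0.1 image.raw
```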

Sprite Distortion (Ghosting While Moving and Artifacts when Mirroring to External Display)

I am working on my first MonoGame project. I love the framework so far!

I have implemented my own letterbox/pillarboxing to scale my native resolution by the maximum integer scale allowable on my display. Basically, I determine the maximum integer scale, set my PreferredBackBuffer to the screen resolution, create a Viewport that is my native resolution * maximum scale, and then set my SpriteBatch to draw everything at Matrix.CreateScale(max_scale).

This works much better than rendering to a texture that has my native resolution and then scaling it up. (Scaling using a Matrix in SpriteBatch, as opposed to just rendering to a texture and then scaling it up, allows you to fake “subpixel rendering” to some extent).

That said, I am facing two issues.

  1. When my sprite moves, there is very subtle “ghosting” happening. The sprite is subtly blurry and there’s a faint ghostly trail behind it.

  2. When I mirror to an external monitor, there is less ghosting, but there is ugly artifacting on the outside of the sprite when it is stationary. See below.

enter image description here

Rounding my player’s position to integers in the Draw() call doesn’t help with either problem.

Does anyone have any thoughts about how to fix these issues?

Here is my game class:

using System;
using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Graphics;
using Microsoft.Xna.Framework.Input;

namespace MyMonoGame
{
    public class MyGame : Game
    {
        // declare variables
        GraphicsDeviceManager graphics;
        SpriteBatch sprite_batch;

        // resolution management
        int native_width;
        int native_height;
        int screen_width;
        int screen_height;
        int max_scale;
        int horizontal_margin;
        int vertical_margin;

        // objects
        Player player;

        public MyGame()
        {
            // create GraphicsDeviceManager instance
            graphics = new GraphicsDeviceManager(this);
            // specify root directory
            Content.RootDirectory = "Content";
        }

        protected override void Initialize()
        {
            // set window title
            this.Window.Title = "My Game";

            // create SpriteBatch instance, which can be used to draw textures.
            sprite_batch = new SpriteBatch(GraphicsDevice);

            // initialize some variables
            native_width = 160;
            native_height = 144;

            // resolution management
            // get screen size
            screen_width = GraphicsAdapter.DefaultAdapter.CurrentDisplayMode.Width;
            screen_height = GraphicsAdapter.DefaultAdapter.CurrentDisplayMode.Height;
            // get max_scale, the maximum integer scale that will fit on the screen
            // note: must be integer to prevent pixel distortion
            int width_divisor = (int) Math.Floor((float)screen_width/(float)native_width);
            int height_divisor = (int) Math.Floor((float)screen_height/(float)native_height);
            max_scale = Math.Min(width_divisor, height_divisor);
            // get margins for letterboxing and pillarboxing
            int max_width = native_width * max_scale;
            int max_height = native_height * max_scale;
            horizontal_margin = (int)((screen_width - max_width)/2f);
            vertical_margin = (int)((screen_height - max_height)/2f);

            // toggle fullscreen
            graphics.PreferredBackBufferWidth = screen_width;
            graphics.PreferredBackBufferHeight = screen_height;
            graphics.ToggleFullScreen();
            GraphicsDevice.Viewport = new Viewport(horizontal_margin, vertical_margin, native_width * max_scale, native_height * max_scale);
            graphics.ApplyChanges();

            // objects
            player = new Player(this);
        }

        protected override void LoadContent()
        {
        }

        protected override void UnloadContent()
        {
        }

        protected override void Update(GameTime gameTime)
        {
            if (Keyboard.GetState().IsKeyDown(Keys.Escape))
                Exit();

            // update objects
            player.Update(gameTime);
        }

        protected override void Draw(GameTime gameTime)
        {
            // clear window & fill with solid color
            GraphicsDevice.Clear(Color.DarkRed);

            // draw objects
            var transform_matrix = Matrix.CreateScale(max_scale);
            sprite_batch.Begin(SpriteSortMode.Deferred, BlendState.AlphaBlend, SamplerState.PointClamp, DepthStencilState.None, RasterizerState.CullCounterClockwise, transformMatrix: transform_matrix);
            player.Draw(sprite_batch);
            sprite_batch.End();
        }
    }
}

Artifacts at seams between meshes in Unity in isometric camera only

I’m trying to piece together some very simple placeholder models that I intend to use as tiles in Unity and I’m seeing odd pixel artifacts along the seams of the tiles that only show up when using an isometric camera. You can see them in this image.

Example of isometric artifacts

And the same geometry but just from a perspective camera shows no artifacts.

Example of perspective with no artifacts

I verified that the two models are exactly aligned right along the seam by checking the actual vertex data. The artifacts are dependent on the camera itself and shift along the seam as the camera is panned and zoomed.

I’ve disabled shadows, set the texture filter mode to Point, and disabled mipmap generation.

The geometry is quite simple and is imported from the OBJ file below.

# normals
vn -1 0 0
vn 1 0 0
vn 0 0 1
vn 0 0 -1
vn 0 -1 0
vn 0 1 0

# texcoords
vt 0.970703 0.5
vt 0.974609 0.5

# verts
v 0 2 0
v 0 2 -4
v 0 2.3 0
v 0 2.3 -4
v 0 2.5 0
v 0 2.5 -4
v 4 2 0
v 4 2 -4
v 4 2.3 0
v 4 2.3 -4
v 4 2.5 0
v 4 2.5 -4

# faces
f 3/2/1 2/2/1 1/2/1
f 4/2/1 2/2/1 3/2/1
f 5/1/1 4/1/1 3/1/1
f 6/1/1 4/1/1 5/1/1
f 7/2/2 8/2/2 9/2/2
f 9/2/2 8/2/2 10/2/2
f 9/1/2 10/1/2 11/1/2
f 11/1/2 10/1/2 12/1/2
f 7/2/3 3/2/3 1/2/3
f 9/1/3 5/1/3 3/1/3
f 9/2/3 3/2/3 7/2/3
f 11/1/3 5/1/3 9/1/3
f 2/2/4 4/2/4 8/2/4
f 4/1/4 6/1/4 10/1/4
f 8/2/4 4/2/4 10/2/4
f 10/1/4 6/1/4 12/1/4
f 2/2/5 7/2/5 1/2/5
f 8/2/5 7/2/5 2/2/5
f 5/1/6 11/1/6 6/1/6
f 6/1/6 11/1/6 12/1/6

The following texture is applied as the Albedo on a default Unity material (it looks a bit odd since it was originally generated in MagicaVoxel):

Texture

I’m really at a loss as to what could be causing these to show up. I only spotted them because I was testing an outline shader, and it was outlining all the artifacts since the normals on those pixels were odd. With a pixel shader set to display _CameraNormalsTexture instead of the color, the artifacts are still visible as variances in the normals, as you can see in the image below.

Same image but of camera normals

How to get rid of clipping artifacts?

The following is a MWE extracted from a more complex graphic.

vertex = {8 Sqrt[2/((5 - Sqrt[5]) (10 + 2 Sqrt[5]))], 0,
   8/Sqrt[(10 - 2 Sqrt[5]) (10 + 2 Sqrt[5])]};

example[r_] :=
  Show[
   Graphics3D[{
     Opacity[0.3, GrayLevel[0.8]], Sphere[{0, 0, 0}, r],
     GrayLevel[0.2], Opacity[1],
     Style[Sphere[{0, 0, 0}, 3],
      ClipPlanes ->
       Hyperplane[vertex, Cos[30 Degree]*3*Normalize[vertex]]]},
    Boxed -> False],
   Lighting -> "Neutral",
   ViewPoint -> {1.7, -2.7, 1}, ViewVertical -> {0, 0, 1},
   ImageSize -> {1000, Automatic}]

Export["ex1.png", example[3]]
Export["ex2.png", example[3.001]]

First example

Second example

The second example is a bit better than the first one, but both show artifacts that I’d like to get rid of. (To clarify what I’m talking about: in both cases, the boundary of the spherical cap is clearly some $n$-gon and not a circle. And example[2.999] would have the same effect.)

How can I improve this? Is there a way to increase the mesh granularity of the sphere? Or is there a better way to create the spherical cap?
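For comparison, here is a sketch of building the cap directly rather than clipping it out of a Sphere. This cap is centered on the z-axis and would still need to be rotated so its axis points along vertex, so I’m not sure it is the better route:

```mathematica
(* direct parameterization of a 30-degree spherical cap of radius 3;
   PlotPoints controls how smooth the boundary circle looks *)
cap = ParametricPlot3D[
   3 {Cos[u] Sin[v], Sin[u] Sin[v], Cos[v]},
   {u, 0, 2 Pi}, {v, 0, 30 Degree},
   Mesh -> None, PlotPoints -> 60, Boxed -> False]
```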

Artifacts when returning from suspend

Sometimes, when I return from a suspended state, I can see some artifacts in random areas of some application windows.

enter image description here

They are quite annoying, and sometimes they make the application totally unusable, to the point that I have to close it and open it again. In some cases, like the “switch user” screen, they make it simply unusable.

What are those things? Why do they appear? How can I avoid them?

How to manage versioning of different artifacts when using Maven and GitlabCI?

I have been using Maven for almost a year and have recently started using GitLab CI/CD. I am trying to develop CI pipelines for the build and deployment of microservices. Each microservice is packaged as a Docker image and stored in Nexus. We will be using Kubernetes to manage the distribution of the services to test and production environments.

We are a small team of 3 developers and we currently have around 10 services. Both of these numbers will increase so it is important that we find a way to describe the workflow and to automate it as much as possible.

Reading around the subject, some have suggested that each build should be seen as a potential release. I understand this to mean that there should only be one build and this build gets retagged as it is promoted towards a release.

My difficulty here is regarding the use of SNAPSHOT and version numbers in the project POMs.

My current plan:

  • use feature-branches while implementing new features
  • when complete, features will be merged into the development branch, which triggers a full build and test.
  • the resulting Docker image will be stored in Nexus as x.y.z-${COMMIT_ID}
  • a release-candidate will be merged to releases/x.y.z triggering a retag of the docker image in Nexus to x.y.z-rc
  • a release will be merged from releases/x.y.z to the master branch triggering a retag of the docker image in Nexus to x.y.z

(Deployment to actual servers will be handled by Kubernetes)
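In terms of concrete commands, I picture each retagging step as something like this (registry host and image name are placeholders):

```shell
# After merge to releases/x.y.z: promote the commit-tagged image to -rc
docker pull nexus.example.com/myteam/service-a:1.2.3-abc1234
docker tag  nexus.example.com/myteam/service-a:1.2.3-abc1234 \
            nexus.example.com/myteam/service-a:1.2.3-rc
docker push nexus.example.com/myteam/service-a:1.2.3-rc

# After merge to master: promote the -rc image to the final version
docker tag  nexus.example.com/myteam/service-a:1.2.3-rc \
            nexus.example.com/myteam/service-a:1.2.3
docker push nexus.example.com/myteam/service-a:1.2.3
```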

Do I need to use SNAPSHOT at all with this kind of workflow? The microservice projects don’t have build dependencies on each other, so I can’t immediately see the point of adding SNAPSHOT to the POM version numbers.

Can anyone comment on this? Are there any well-documented workflows for this type of project?

Continuous Integration of dependent Development Artifacts

I have transitioned companies, and so I, who have programmed in Python for the last 7 years, now work with a Java stack. I wanted to improve the continuous integration setup of the project I joined.

Setting

Continuous Delivery Stages, taken from http://ptgmedia.pearsoncmg.com/images/chap5_9780321601919/elementLinks/fig5_4.jpg

So the general idea is: if your software project consists of several *.jar files (the software artifacts), you build them at the commit stage and put them into an artifact store; later stages load them from the store and integrate them. “Continuously” means that these integration/acceptance tests are triggered by every artifact upload, and artifact uploads are triggered by every push to the source code repositories of the involved software projects.

Question

To me it is unclear how I can upload artifacts to an artifact store like Artifactory on every push, since with Maven the version is hardcoded into the pom.xml.

In Python I used a package management plugin that would generate a unique version number for development builds using the latest git tag and an incremental number of commits, which meant that in the acceptance test stage, I could just pull the most recent released artifacts from an artifact store.

In Java/Maven I don’t know how to do this, and I haven’t found much so far.
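One thing that looks like it might be the analogue of the Python approach is Maven’s “CI-friendly versions” feature (Maven 3.5+), where pom.xml declares a placeholder version and the CI job fills it in; I believe the flatten-maven-plugin is also needed so the deployed POM contains the resolved version, but I haven’t verified this:

```shell
# pom.xml contains <version>${revision}</version>
# the CI job then supplies a unique version per push, e.g. tag + short SHA:
mvn clean deploy -Drevision=1.2.3-$(git rev-parse --short HEAD)
```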