What is the least intrusive way to make tiny changes in Magento 2.3.1?

While I wasn’t terrible with Magento 1, M2 has changed so much that I’m not sure where to start. And everyone talking about “plugins” simply copies and pastes the developer’s guide; no one gives an example or explains the whys or the hows.

What are the ways to make small UI improvements for examples like these (an extension, a plugin, or whatever “an extension with a plugin” is)?

  1. In the admin area, a bar appears encouraging you to refresh the cache. When I go to the Cache Management page, it already knows which cache types need refreshing, so I want those pre-selected.
  2. Remove the “Select All” option in drop-downs?
  3. Change the From name in the order and contact emails?
  4. Copy the comments block from the adminhtml order page to the checkout?

Everyone wants to build themes and grandiose projects, whereas I’m after the smallest possible change and the smallest impact on the code.
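
For reference, the closest thing to a minimal “plugin” I have pieced together from the developer’s guide is an interceptor registered in a module’s etc/di.xml (Vendor\Module and the intercepted class/method below are placeholders, not a real tweak):

<config xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:noNamespaceSchemaLocation="urn:magento:framework:ObjectManager/etc/config.xsd">
    <type name="Magento\Backend\Block\Cache">
        <plugin name="vendor_module_cache_tweak" type="Vendor\Module\Plugin\CachePlugin"/>
    </type>
</config>

plus a class whose before/after/around methods wrap the matching public method of the intercepted class:

<?php
namespace Vendor\Module\Plugin;

class CachePlugin
{
    // An "after" interceptor: receives the original return value and may
    // modify it. afterSomeMethod must match a real public someMethod().
    public function afterSomeMethod(\Magento\Backend\Block\Cache $subject, $result)
    {
        return $result;
    }
}

Is that the right tool for tweaks this small, or is it overkill?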

Postfix generic changes causing DKIM permerror

We have modified the Return-Path: and Reply-To: headers of our email by doing the following:

/etc/postfix/main.cf

smtp_generic_maps = hash:/etc/postfix/generic 

/etc/postfix/generic

bounce@mainserver.com       bounce@relay.com
admin@mainserver.com        admin@relay.com
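
(The lookup table is rebuilt with postmap after each edit, as usual:)

postmap /etc/postfix/generic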

Before adding this, we got:

Return-Path: <bounce@mainserver.com>
Authentication-Results: mta1087.mail.ir2.yahoo.com
  header.i=@relay.com header.s=mail dkim=pass (ok);
  spfDomain=mainserver.com spfResult=none;
  dmarc=pass(p=none sp=quarantine dis=none) header.from=relay.com
Received-SPF: none (domain of mainserver.com does not designate permitted sender hosts)

After:

Return-Path: <bounce@relay.com>
Authentication-Results: mta1129.mail.ir2.yahoo.com
  header.i=@relay.com header.s=mail dkim=permerror (bad sig);
  spfDomain=relay.com spfResult=pass;
  dmarc=pass(p=none sp=quarantine dis=none) header.from=relay.com
Received-SPF: pass (domain of relay.com designates 51.89.165.124 as permitted sender)

This change is causing our DKIM check to fail with permerror (bad sig). Is there a fix for this, or another approach?

How do I sync my Dataset in Visual Studio with SQL Server data type changes without losing custom queries?

I know that similar questions have been asked, but I haven’t been able to get an exact answer.

I have a Dataset.xsd item in Visual Studio that is based on SQL Server tables. If I make lots of SQL Server column changes (data length, null to not null, new columns, varchar to nvarchar), how can I run the Custom Tool without it potentially wiping out my TableAdapters’ existing custom queries? Or is there another way to sync them? I can manually modify the table columns in the XSD file, but that gets tedious when I have hundreds of changes to make, and I am also afraid I might miss one or two.

BTW, I am currently using VS 2017 and SQL Server 2016.

Save As Template button: if application criteria change, should the changes also be saved on clicking Save As Template?

A usability issue I am running into with my team:

When you are in the middle of an application, there is a Save as Template button that, when clicked, opens a modal to name the template; once the template is saved, it returns you to the application you were working on, not the template you just created.

My question is: if you change the criteria in Step 1 without hitting the Save button at the bottom of that page, and then click “Save as Template”, should the edited areas be saved with the new template? Or should there be a modal telling the user they need to save first? Or do you see the button as simply being named incorrectly?

(example image; source: bakerdesign.ca)

How to make Drupal 8 aware of entity schema changes that are managed externally to Drupal

Though it’s ideal to centralize entity and schema management together within Drupal, there are possible use cases where the two must be managed independently:

  1. Drupal entities that are based on tables in an external DB
  2. Manual schema updates for tables with existing data that are considered unsafe from a cross-DB support perspective, but may otherwise be safe in controlled situations (e.g. increasing a varchar length on MySQL).

Our case is specific to #1. We have Drupal entities based on tables in an external DB (using the base_table = "dbname.tablename" convention), and the related schema is managed exclusively by another platform. This means that when schema changes are deployed externally, we need to update the relevant entity definitions and installed schema definitions in Drupal without triggering the actual schema updates themselves (e.g. no CREATE/DROP/ALTER TABLE... calls). The best way to tackle this problem has always been a bit unclear, and some of the new constraints introduced in 8.7.x seem to cloud the matter even further.
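
For concreteness, a trimmed-down sketch of that convention (the entity type ID, class and table names here are placeholders):

<?php

use Drupal\Core\Entity\ContentEntityBase;

/**
 * @ContentEntityType(
 *   id = "my_custom_entity_type",
 *   label = @Translation("My custom entity"),
 *   base_table = "external_db.external_table",
 *   entity_keys = {
 *     "id" = "id",
 *   },
 * )
 */
class MyCustomEntity extends ContentEntityBase {}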

It seems that there are several Drupal-specific layers that need to be navigated here, including the entity definition, the entity storage definition, the last installed schema definition, the actual schema management API and various caches. The interplay between these layers has been difficult to pin down.

Possible Approaches

Some solutions proposed in forums involve manual manipulation of each API layer and often ad-hoc workarounds (direct updates to key_value DB records) that seem quite questionable. Examples are here, here and the “Updating an existing entity type” notes here. There also appears to be some discussion around overall limitations in field-specific schema updates here.

Some relief may come in the form of EntityDefinitionUpdateManagerInterface, which seems to serve as a central place to abstract these various layers away, but it’s not clear which methods operate purely on storage/schema definitions rather than the actual DB schema itself. For example, I have had some success (re)using the ::installEntityType() method to deploy changes, by externally updating the DB schema, updating the related Drupal entity definition and then running:

$entity_type = \Drupal::entityTypeManager()->getDefinition('my_custom_entity_type');
$entity_definition_update_manager = \Drupal::entityDefinitionUpdateManager();
$entity_definition_update_manager->installEntityType($entity_type);

This seems to update the related entity/field storage definitions and the schema definitions, and does not seem to alter the DB schema (I guess because the related table is detected as already existing). But given that this method is documented for one-off use (not ongoing re-install use), this feels like “off-label” use with unpredictable side effects.
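
For deployments, the natural home for this would presumably be a hook_update_N() that runs after the external schema change lands, along these lines (a sketch only; the module name and update number are placeholders):

<?php

/**
 * Re-syncs installed entity/schema definitions after an external change.
 */
function mymodule_update_8701() {
  $entity_type = \Drupal::entityTypeManager()->getDefinition('my_custom_entity_type');
  \Drupal::entityDefinitionUpdateManager()->installEntityType($entity_type);
}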

I also suppose that a custom entity storage definition could be justified to deal with all this, but I assume that would require re-defining or re-wiring quite a few dependencies (notably the low-level SQL management) that otherwise work well with the native entity storage.

Final Question

So what is the best-practice way to make Drupal aware of externally managed schema updates that impact Drupal entities?

I have a feeling that fully understanding this also requires a good explanation of how the APIs used to manage entity/field definitions and storage are independent (or not) of core schema management tools like the Schema API and hook_schema().

How to avoid saving the value of a property of a form object when saving changes to the db

In a CRUD ASP.NET Core 2.2 web app, I need to avoid saving one property of a form object to the db. How do I do that?

I’ve tried using the [Editable(false)] data annotation on the ListBin property to prevent saving the property’s value to the db.

using System;
using System.ComponentModel.DataAnnotations;
using System.ComponentModel.DataAnnotations.Schema;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.Mvc.RazorPages;
using Microsoft.AspNetCore.Mvc.Rendering;
using Microsoft.EntityFrameworkCore;

[Table("supply_lists")]
public partial class SupplyLists
{
    [Column("id")]
    public int Id { get; set; }

    [Column("category_id")]
    public int CategoryId { get; set; }

    [Required]
    [Column("coursecode")]
    [StringLength(200)]
    public string Coursecode { get; set; }

    [Required]
    [Column("title")]
    [StringLength(200)]
    public string Title { get; set; }

    [Required]
    [Column("filename")]
    [StringLength(200)]
    public string Filename { get; set; }

    [Column("isactive")]
    public bool Isactive { get; set; }

    [Column("date", TypeName = "smalldatetime")]
    public DateTime Date { get; set; }

    // The property I do not want overwritten on a normal edit.
    [Column("list_bin")]
    public byte[] ListBin { get; set; }

    [ForeignKey("CategoryId")]
    [InverseProperty("SupplyLists")]
    public virtual SupplyListCategory Category { get; set; }
}

[ModelMetadataType(typeof(MetaDataTypeModel))]
public partial class SupplyLists
{
}

public class MetaDataTypeModel
{
    // My attempt to stop the value from being saved back to the db.
    [Editable(false)]
    public byte[] ListBin { get; set; }

    [Display(Name = "Is Active")]
    public bool Isactive { get; set; }

    [Display(Name = "Course Code")]
    public string Coursecode { get; set; }

    [Display(Name = "Category")]
    public int CategoryId { get; set; }

    [DataType(DataType.Date)]
    public DateTime Date { get; set; }
}

public class EditModel : PageModel
{
    private readonly SupplyListCore22.Models.SupplyListsContext _context;
    private readonly IHostingEnvironment _env;

    public EditModel(SupplyListCore22.Models.SupplyListsContext context, IHostingEnvironment env)
    {
        _context = context;
        _env = env;
    }

    [BindProperty]
    public SupplyLists SupplyLists { get; set; }

    [BindProperty]
    public FileUpload FileUpload { get; set; }

    public async Task<IActionResult> OnGetAsync(int? id)
    {
        if (id == null)
        {
            return NotFound();
        }

        SupplyLists = await _context.SupplyLists
            .Include(s => s.Category).FirstOrDefaultAsync(m => m.Id == id);

        if (SupplyLists == null)
        {
            return NotFound();
        }

        ViewData["CategoryId"] = new SelectList(_context.SupplyListCategory, "Id", "Category");
        return Page();
    }

    public async Task<IActionResult> OnPostAsync()
    {
        //if (!ModelState.IsValid)
        //{
        //    return Page();
        //}

        // Attach the bound form object and mark the whole entity as modified.
        _context.Attach(SupplyLists).State = EntityState.Modified;
        await _context.SaveChangesAsync();

        if (FileUpload.UploadSupplyList != null)
        {
            var fileUploadData = await utilities.utilities.ProcessFormFile(FileUpload.UploadSupplyList, ModelState);
            if (ModelState.ErrorCount > 0)
            {
                ViewData["CategoryId"] = new SelectList(_context.SupplyListCategory, "Id", "Category");
                return Page();
            }

            // Only when a new file is uploaded should ListBin be replaced.
            var sl = _context.SupplyLists.Find(SupplyLists.Id);
            sl.ListBin = fileUploadData;

            await _context.SaveChangesAsync();
        }

        return RedirectToPage("./Index");
    }
}

It sets ListBin to null in the db when saving changes, which is not what I wanted (I wanted to preserve the old value of ListBin in the db).

Calculating a shortest path in a table structure that changes in real time

I have a table that looks like this:

(image: the grid, with NPC and Player pieces on various cells)

In the table, NPCs are AI-like characters that move from one point to another; the Player is a character controlled by the user.

At any moment the player character might move to any cell of the table, but each NPC has a particular goal cell that it should reach while moving. For example, the blue NPC needs to reach the blue cell.

(image: the blue NPC and its blue goal cell)

Other NPCs are also moving. My goal is to write an algorithm that allows each NPC to reach its cell using the shortest path. As far as I understand, this is a typical shortest-path problem.

My question is: is there some algorithm that is particularly optimized to handle such workloads, or will any algorithm suffice, with the path simply recalculated after each move?
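
To make “recalculating after each move” concrete, here is a rough sketch of the naive baseline I have in mind, in TypeScript, assuming a 4-connected grid; Cell, isBlocked and the dimensions are made-up names:

type Cell = { x: number; y: number };

// Plain BFS over an unweighted grid; rerun it whenever the board changes.
function shortestPath(
    width: number,
    height: number,
    isBlocked: (c: Cell) => boolean,
    start: Cell,
    goal: Cell,
): Cell[] | null {
    const key = (c: Cell) => c.y * width + c.x;
    // Maps each visited cell to its predecessor (null for the start).
    const prev = new Map<number, Cell | null>();
    prev.set(key(start), null);
    const queue: Cell[] = [start];

    for (let head = 0; head < queue.length; head++) {
        const cur = queue[head];
        if (cur.x === goal.x && cur.y === goal.y) {
            // Walk predecessors backwards to reconstruct the path.
            const path: Cell[] = [];
            for (let c: Cell | null = cur; c !== null; c = prev.get(key(c)) ?? null) {
                path.unshift(c);
            }
            return path;
        }
        for (const [dx, dy] of [[1, 0], [-1, 0], [0, 1], [0, -1]]) {
            const next = { x: cur.x + dx, y: cur.y + dy };
            if (
                next.x >= 0 && next.x < width &&
                next.y >= 0 && next.y < height &&
                !prev.has(key(next)) &&
                !isBlocked(next)
            ) {
                prev.set(key(next), cur);
                queue.push(next);
            }
        }
    }
    return null; // the goal is currently unreachable
}

Calling this again for every NPC each time anything moves is what feels wasteful, hence the question.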

How to handle user permission changes in a SPA?

I have a SPA which, at application startup, calls the backend API, sends a JWT and asks for the access permissions of the current user. The SPA then caches the permissions in-memory and uses them to check what the user is permitted to do, and clears the cache when the user takes an action after which the cache might become invalid (out of sync with the backend), like buying something which grants more permissions.
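
Roughly, the caching side looks like this (heavily simplified; all names here are made up):

let cachedPermissions: Set<string> | null = null;

// Returns the user's permissions, fetching them once and caching in memory.
async function getPermissions(): Promise<Set<string>> {
    if (cachedPermissions === null) {
        // The JWT is attached by the HTTP layer; the endpoint is a placeholder.
        const res = await fetch('/api/me/permissions');
        cachedPermissions = new Set<string>(await res.json());
    }
    return cachedPermissions;
}

// Called after actions (like a purchase) that may change permissions.
function invalidatePermissions(): void {
    cachedPermissions = null;
}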

This worked great, but now there is new functionality which allows someone else (not the user who is currently browsing the SPA) to grant or remove permissions for other users. This is not an issue for the backend, because when a new permission is granted to user A, the backend state is immediately updated and subsequent requests check whether user A has the required permissions to execute each action. The issue is that the SPA does not know that user A has a new set of permissions, since the old ones are cached. That means that even though from the backend’s perspective the user is allowed to execute new actions, the frontend thinks that the user does not have the required permissions, so it prevents the action and shows an error message. This mismatch between frontend and backend persists until the user reloads the page.

  • User A is browsing the SPA website
  • User A asks, let’s say, an admin to grant him a permission to do X
  • Admin grants user A a permission to do X (now backend allows user A to do X) and informs user A of this
  • User A tries to do X but the frontend permission cache says that there is no permission for X and so an access denied error is shown

Any suggestions on how this can be handled? One idea is to store, in the backend (which is a single machine), a list of users whose frontend cache needs to be invalidated. With every request, check whether the user executing that request is in the list, and if so, “cancel” the request and return an HTTP status code indicating that the SPA cache needs to be cleared before continuing. The SPA would know how to handle this response code: it would clear the cache, reload the current page (or redirect to the home page, or something else) and repopulate the permission cache. This approach seems workable, but it feels somewhat hacky and complex, so it would be great to hear some more insights.
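
On the SPA side, the handling could be as small as a fetch wrapper along these lines (a sketch building on the cache shape above; the 419 status code is an arbitrary placeholder, not an established convention):

const PERMISSIONS_STALE = 419; // hypothetical "permission cache is stale" status

async function apiFetch(input: RequestInfo, init?: RequestInit): Promise<Response> {
    const response = await fetch(input, init);
    if (response.status === PERMISSIONS_STALE) {
        // The backend flagged this user for invalidation: resync and retry once.
        invalidatePermissions();
        await getPermissions();
        return fetch(input, init);
    }
    return response;
}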