"The HOME or COMPOSER_HOME environment variable must be set for composer to run correctly" error

When I try to complete the checkout by paying with the Hosted Payment Form (mastercard module), I get a 400 Bad Request error. I was reading the exception.log file and found an error that says the following:

main.CRITICAL: The HOME or COMPOSER_HOME environment variable must be set for composer to run correctly {"exception":"[object]

I searched the internet and tried the following commands:

export COMPOSER_HOME="$HOME/.config/composer/"
export CGR_BASE_DIR="$HOME/.config/composer/global"
export CGR_BIN_DIR="$HOME/.config/composer/vendor/bin"

but none of these worked; I'm still getting the error. I hope someone can help me. Greetings!
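One thing worth noting: exporting the variable in an interactive shell only affects that shell, while the Magento checkout runs under the web server's PHP process, which has its own environment. A minimal sketch of both sides, where the PHP-FPM pool file path and the composer directory are assumptions to adjust to your install:

```shell
# Exports in a login shell do not reach the PHP process serving the site.
# Under PHP-FPM the variable can instead be set in the pool configuration
# (path, pool name, and directory below are assumptions):
#
#   ; /etc/php-fpm.d/www.conf
#   env[COMPOSER_HOME] = /var/www/.config/composer
#
# For a shell session, the export itself looks like this:
export COMPOSER_HOME="$HOME/.config/composer"
echo "$COMPOSER_HOME"
```

After editing a pool file, PHP-FPM needs a restart for the new environment to take effect.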

Customized environment for new users in XFCE

I'm coming back to the community because I have a frustrating problem.

I just migrated from Xubuntu 14.04 to Xubuntu 16.04. The environment was customized: every time a student logged in (I work in a high school), they always got the same environment, namely this one:

I was using the /etc/skel folder to do this, copying my .config and .gconf into /etc/skel. It worked perfectly and provided a beautiful desktop.
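For anyone unfamiliar with the technique, it can be sketched with scratch directories standing in for the real paths (all directory names here are illustrative):

```shell
# Sketch of the /etc/skel technique, using scratch directories in place of
# /etc/skel and the reference account's home (names are illustrative).
skel=$(mktemp -d)    # stands in for /etc/skel
ref=$(mktemp -d)     # stands in for the reference user's home
mkdir -p "$ref/.config/xfce4"
echo "demo setting" > "$ref/.config/xfce4/settings.txt"
# copy the dotfolders into the skeleton, preserving attributes
cp -a "$ref/.config" "$skel/"
# useradd -m initialises each new home from the skeleton; simulate one:
newhome=$(mktemp -d)
cp -a "$skel/." "$newhome/"
cat "$newhome/.config/xfce4/settings.txt"   # prints "demo setting"
```

On a real system the copies would target /etc/skel itself (with sudo), and only accounts created after the copy pick up the skeleton.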

The problem is that I can't reproduce this on the migrated workstation. As soon as a new user logs in, I get this:

And after that, it gives me this sad result:

I have already done this on MATE 16.04 and Ubuntu 18.04, but now I'm stuck. Is this a problem with XFCE?

Thank you for your help!

CentOS: run a single site on startup (browser GUI) without a desktop environment

What I want is for my CentOS 7 server (minimal install) to automatically show the user this site at startup, so the user can just browse it and do nothing else.

So there will be no desktop, just the YouTube site the user can surf.

One suggestion I got was:

sudo xinit firefox --new-tab "" 

or to use nativefier to make a YouTube-like app and launch it, but I don't know what I should install on CentOS to run this (I don't want a desktop to be installed).
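One way to sketch the minimal-install route: install a bare X server plus Firefox (no desktop environment) and launch the browser from ~/.xinitrc. The package names below are CentOS 7 assumptions, and Firefox's --kiosk flag needs a reasonably recent Firefox:

```shell
# Bare X server plus Firefox, with no desktop environment. The package
# names are CentOS 7 assumptions:
#   sudo yum install -y xorg-x11-server-Xorg xorg-x11-xinit firefox
# For the kiosk user, make X start Firefox full-screen and nothing else:
cat > "$HOME/.xinitrc" <<'EOF'
exec firefox --kiosk "https://www.youtube.com"
EOF
# startx    # run from the console; X exits when Firefox is closed
```

startx could then be triggered automatically at login (e.g. from the kiosk user's shell profile on tty1) so the machine boots straight into the site.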

Best way to prevent duplicate payment of order in multi user environment?

I have an application in which users can create an order and pay online using stripe.js

When the order is created, its status is set to unpaid. The user then has the option to review the order on a checkout page and make a payment on a payment page, which has the form for them to enter their card details.

I use the order id as an idempotency key to ensure no duplicate payments are made, e.g. if a user clicks the pay button multiple times on the payment page.

However, since it's a multi-user app (other users from the same organization can view all orders and make payments), I want to disable the pay button on the checkout page for any order that is currently being paid, display a 'payment in progress' message, and redirect any other user trying to access the payment page for the same order while another user is already on it.

If the first paying user navigates away from the payment page without making a payment, then the pay button on the checkout page can be re-enabled.

Any ideas how you would do this, maybe using some kind of session?
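One common pattern is to pair the idempotency key with a short-lived, atomically acquired lock per order. In a web app this would typically be a Redis SET with NX and an expiry, or a database row updated with a compare-and-set; the shape is sketched here with an atomic mkdir standing in for the store:

```shell
# Sketch of the "payment in progress" lock. In a real app this would be
# Redis (SET key val NX EX 120) or a DB row with a compare-and-set update;
# mkdir stands in here because it is atomic: only one caller can succeed.
order_id=$$                          # hypothetical order id (unique per run)
lock="/tmp/payment-lock-$order_id"
if mkdir "$lock" 2>/dev/null; then
  echo "lock acquired: show the payment form"
  # ...user pays, or navigates away; either way, release the lock...
  rmdir "$lock"                      # re-enables the pay button for others
else
  echo "payment in progress: disable the pay button"
fi
```

A TTL matters in practice: if the first user closes their browser without releasing, the lock should expire on its own (Redis's EX gives you this for free) rather than blocking the order forever, which also covers the "navigates away" case.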

Software process as a manufacturing process vs. the software process in a Process-Sensitive Software Engineering Environment

As stated in the title, I have been researching the different views on, and differences between, the software process as a manufacturing process and the software process in a process-sensitive software engineering environment, as input for my research. Could you please help me with links or your own opinion?

How to manage environment variables, parameters and secrets for local environments, CI and deployed services?

What do people use to manage environment variables, parameters and secrets? I have a full-stack app deployed with Terraform, so I'm dealing with:

  • Building a shadow-cljs project, injecting env vars for auth, the backend, and Google API keys
  • Running a backend service that needs API keys and database credentials
  • Managing CI, which needs access to parameters and secrets for builds and deployment
  • Managing Terraform, which requires input variables for my configuration and also produces outputs that are passed into other services (for example the frontend and backend)
  • Developing locally, where I need access to those variables, parameters and secrets in some capacity

Right now I’ve hacked together a combination of:

  • Terraform passing vars, parameters and secrets into services such as the backend
  • Defining env vars in my CI using its environment variable store
  • Pulling env vars from terraform outputs when needed for building and deploying.

Ideally I'd like to manage all this centrally somehow. What have people used in the past for consolidating all this?
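For what it's worth, one low-tech pattern that covers local dev, builds, and CI is a single env file loaded everywhere, with real secrets kept in a dedicated store (Vault, AWS SSM/Secrets Manager, etc.) and only referenced from it. A minimal sketch of the loading side, where the file name and variable names are illustrative:

```shell
# One consolidation pattern: a single .env file as the source of truth,
# loaded the same way by local dev, CI, and build scripts.
# File name and variable names are illustrative.
cat > .env <<'EOF'
GOOGLE_API_KEY=local-placeholder
DATABASE_URL=postgres://localhost/dev
EOF
set -a     # auto-export every variable assigned while this is on
. ./.env
set +a
# terraform outputs can be folded into the same environment, e.g.:
#   BACKEND_URL=$(terraform output -raw backend_url)
echo "$GOOGLE_API_KEY"   # prints "local-placeholder"
```

CI systems can then populate the same variable names from their own secret stores, so application code never needs to know where a value came from.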

Should a developer be able to create a docker artifact from a lerna monorepo in their development environment?

I’ve recently started using lerna to manage a monorepo, and in development it works fine.

Lerna creates symlinks between my various packages, so tools like 'tsc --watch' or nodemon work fine for detecting changes in the other packages.

But I’ve run into a problem with creating docker images in this environment.

Let’s say we have a project with this structure:

root
  packages
    common → artifact is a private npm package; depends on utilities, something-specific
    utilities → artifact is a public npm package
    something-specific → artifact is a public npm package
    frontend → artifact is a docker image, depends on common
    backend → artifact is a docker image, depends on common and utilities

In this scenario, in development, everything is fine. I’m running some kind of live reload server and the symlinks work such that the dependencies are working.

Now let’s say I want to create a docker image from backend.

I’ll walk through some scenarios:

  1. I ADD package.json in my Dockerfile, and then run npm install.

    Doesn’t work, as the common and utilities packages are not published.

  2. I run my build command in backend, and ADD /build and /node_modules in the Dockerfile.

    Doesn't work: my built backend has require('common') and require('utilities') calls; these are in node_modules (symlinked), but Docker will just ignore the symlinked folders.

    Workaround: using cp --dereference to ‘unsymlink’ the node modules works. See this AskUbuntu question.

  3. Step 1, but before I build my docker image, I publish the npm packages.

    This works OK, but for someone who is checking out the code base and making a modification to common or utilities, it's not going to work, as they don't have privileges to publish the npm packages.

  4. I configure the build command of backend to not treat common or utilities as an external, and common to not treat something-specific as an external.

    I then first build something-specific, then common, then utilities, and then backend.

    This way, when the build is occurring, and using this technique with webpack, the bundle will include all of the code from something-specific, common and utilities.

    But this is cumbersome to manage.
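The cp --dereference workaround mentioned in scenario 2 can be sketched concretely: with --dereference (-L), cp replaces each symlink with a real copy of its target, so Docker's ADD sees plain directories. The paths below are illustrative:

```shell
# Demonstrating the scenario-2 workaround: --dereference (-L) makes cp
# copy what a symlink points to instead of the link itself, so the
# resulting node_modules contains real folders Docker can ADD.
work=$(mktemp -d)
mkdir -p "$work/packages/common" "$work/backend/node_modules"
echo "module.exports = {}" > "$work/packages/common/index.js"
ln -s "$work/packages/common" "$work/backend/node_modules/common"   # what lerna does
cp -R --dereference "$work/backend/node_modules" "$work/backend/node_modules_deref"
# node_modules_deref/common is now a real directory with the file inside
```

In a build script you would run this copy just before docker build and ADD the dereferenced folder instead of node_modules.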

It seems like quite a simple problem I'm trying to solve here: I want to take the code that is currently working on my machine and put it into a docker container.

Remember, the key thing we want to achieve here is for someone to be able to check out the code base, modify any of the packages, and then build a docker image, all from their development environment.

Is there an obvious lerna technique that I’m missing here, or otherwise a devops frame of reference I can use to think about solving this problem?