SQL Server Agent – Report Failure but continue When intermediate step fails

I have a SQL Server Agent job that has three steps with the following control flow:

  • Step 1 – on success: go to next step; on fail: the job fails
  • Step 2 – on success: go to next step; on fail: go to next step
  • Step 3 – on success: report success; on fail: report failure

However, what I want to happen is this: if step 2 fails, still run step 3, but report the job as failed (regardless of whether step 3 succeeds).

The only way I can think of to do this is shown in the screenshot below: duplicate the final step, with the duplicate configured to report failure even when it succeeds.

[screenshot: job step flow with the duplicated final step]

Is there a better way of doing this?
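For reference, the duplicated-final-step workaround can be expressed with the msdb stored procedures. This is only a sketch: the job name and the assumption that step 4 is the duplicate of step 3 are mine, not taken from the actual job.

```sql
-- Action codes for sp_update_jobstep:
--   1 = quit reporting success, 2 = quit reporting failure,
--   3 = go to the next step,    4 = go to the step in @on_*_step_id.

-- Step 2: on success fall through to step 3; on failure jump to step 4.
EXEC msdb.dbo.sp_update_jobstep
     @job_name = N'MyJob', @step_id = 2,
     @on_success_action = 3,
     @on_fail_action = 4, @on_fail_step_id = 4;

-- Step 4 (the duplicate of step 3): quit reporting failure either way.
EXEC msdb.dbo.sp_update_jobstep
     @job_name = N'MyJob', @step_id = 4,
     @on_success_action = 2,
     @on_fail_action = 2;
```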

Applying a function to several columns of a dataset fails on Missing

Consider a dataset with missing values:

no2 = Dataset[{
  <|"timestamp" -> DateObject[{2000, 1, 1, 1, 0, 0}, "Instant", "Gregorian", 2.], "BASCH" -> 108., "BONAP" -> Missing["Unrecognized", "n/d"], "PA18" -> 65., "VERS" -> 47.|>,
  <|"timestamp" -> DateObject[{2000, 1, 1, 2, 0, 0}, "Instant", "Gregorian", 2.], "BASCH" -> 104., "BONAP" -> 60., "PA18" -> 77., "VERS" -> 42.|>,
  <|"timestamp" -> DateObject[{2000, 1, 1, 3, 0, 0}, "Instant", "Gregorian", 2.], "BASCH" -> 97., "BONAP" -> 58., "PA18" -> 73., "VERS" -> 34.|>,
  <|"timestamp" -> DateObject[{2000, 1, 1, 4, 0, 0}, "Instant", "Gregorian", 2.], "BASCH" -> 77., "BONAP" -> 52., "PA18" -> 57., "VERS" -> 29.|>,
  <|"timestamp" -> DateObject[{2000, 1, 1, 5, 0, 0}, "Instant", "Gregorian", 2.], "BASCH" -> 79., "BONAP" -> 52., "PA18" -> 64., "VERS" -> 28.|>}]

I can get the mean of a given key easily, even with missing values:

no2[Mean, "BONAP"] (*64.0017*) 

But if I try to apply Mean to two columns, the Missing values become a problem:

no2[Mean, {"BONAP", "PA18"}] 

This returns a dataset with missing values. I suspect that this is not the right syntax, since in the first case the result is numeric, while the second operation returns a dataset. How does one apply a function to several columns?


This works:

no2[Mean, #] & /@ {"BASCH", "BONAP", "PA18", "VERS"} 

But that is not what I’m looking for. I’m looking for a way to do it within the framework of the dataset.

linux-headers installation fails – unmet dependency – libcuda1

I am trying to install CUDA on Ubuntu 18.04.3 LTS according to this documentation from NVIDIA.

I ran into trouble when I tried to install the linux headers with the following command:

sudo apt-get install linux-headers-$(uname -r)

It raised an unmet dependencies error:

The following packages have unmet dependencies:
 libcuinj64-9.1 : Depends: libcuda1 (>= 387.26) or
                           libcuda-9.1-1
E: Unmet dependencies. Try 'apt --fix-broken install' with no packages (or specify a solution).

When I tried

sudo apt --fix-broken install 

I got

dpkg: error processing archive /var/cache/apt/archives/libnvidia-compute-430_430.26-0ubuntu0.18.04.2_amd64.deb (--unpack):
 trying to overwrite '/usr/lib/x86_64-linux-gnu/libnvidia-ml.so', which is also in package nvidia-340 340.107-0ubuntu0.18.04.3
Errors were encountered while processing:
 /var/cache/apt/archives/libnvidia-compute-430_430.26-0ubuntu0.18.04.2_amd64.deb
E: Sub-process /usr/bin/dpkg returned an error code (1)

The whole terminal output can be seen here.

This is the GPU I have:

  *-display
       description: VGA compatible controller
       product: GM204 [GeForce GTX 970]
       vendor: NVIDIA Corporation
       physical id: 0
       bus info: pci@0000:01:00.0
       version: a1
       width: 64 bits
       clock: 33MHz
       capabilities: pm msi pciexpress vga_controller bus_master cap_list rom
       configuration: driver=nouveau latency=0
       resources: irq:29 memory:f6000000-f6ffffff memory:e0000000-efffffff memory:f0000000-f1ffffff ioport:e000(size=128) memory:c0000-dffff

I am not sure where to go from here.
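Based on the dpkg conflict above, the direction I’m considering is removing the old nvidia-340 packaging before retrying. The commands are only echoed here, since I haven’t run anything yet, and the package names come from the error message rather than being verified against the system:

```shell
# Candidate next steps suggested by the dpkg conflict; only echoed here,
# not executed. Package names are taken from the error message above and
# may differ on another machine.
for cmd in \
  "sudo apt-get purge nvidia-340" \
  "sudo apt --fix-broken install" \
  "sudo apt-get install linux-headers-$(uname -r)"
do
  echo "$cmd"
done
```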

Blob cache fails to update image renditions

We’re having a problem with image renditions and the blob cache. When we upload images to a site, the renditions are generated properly, but if we change the crop of an image, that rendition does not update; we have to clear the blob cache to fix it. Sometimes the problem reappears even within 10 minutes of clearing the cache. I don’t think clearing the blob cache all the time is the solution. What might be the problem, and why is the blob cache out of sync most of the time?

SharePoint Foundation REST interface: updating an object fails

I’m using the REST interface detailed here: https://docs.microsoft.com/en-us/previous-versions/office/developer/sharepoint-2010/ff521587(v%3Doffice.14)

As a proof of concept, I’m trying to access an item in a list from our SharePoint 2010 intranet site and update a field.

To do this, I created a Connected OData service to the endpoint (AccountingWorkflowsDataContext), and here is my code:

var tasks = new AccountingWorkflowsDataContext(
    new Uri("https://mysite/_vti_bin/ListData.svc"));
tasks.Credentials = new NetworkCredential("username", "password", "domain");

var task = tasks.GLRecsTasks
    .Where(x => x.StatusValue != "Completed"
             && (x.AssignedToId == 1 || x.AssignedToId == 2))
    .First();

task.DueDate = DateTime.Now.AddDays(5);
tasks.UpdateObject(task);
tasks.SaveChanges();

I can connect to the list, I’m picking up the correct item, and I’m tracking changes to it as well. However, the call to SaveChanges always fails with the message “An error occurred while processing this request.” The inner exception is:

<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<error xmlns="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata">
  <code></code>
  <message xml:lang="en-US">An error occurred while processing this request.</message>
</error>

Can someone shed some light on this?
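As a next diagnostic step, I’m planning to catch the request exception and dump the per-operation responses, which (if I understand the WCF Data Services client correctly; treat this as an unverified sketch) often carry more detail than the generic outer message:

```csharp
try
{
    tasks.SaveChanges();
}
catch (DataServiceRequestException ex)
{
    // Each OperationResponse may carry the server's real error payload,
    // not just "An error occurred while processing this request."
    foreach (OperationResponse response in ex.Response)
    {
        Console.WriteLine(response.StatusCode);
        Console.WriteLine(response.Error);
    }
}
```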

Error Cannot add or update a child row: a foreign key constraint fails

My problem is that after creating the database I get error 1452 on the habitacion table. I have rebuilt the database several times looking for the mistake and changed various things, but the same error keeps appearing when I insert data into a table, in this case habitacion. I would be grateful for any help. This is the code:

Create database hotel;
use hotel;

create table recepcionista(
  Clav_re varchar(10) not null,
  Nombre_re varchar(20) not null,
  A_pre varchar(30) not null,
  A_mre varchar(30) not null,
  telefono varchar(15) not null,
  primary key(Clav_re));

INSERT INTO recepcionista(Clav_re,Nombre_re,A_pre,A_mre,telefono)
VALUES ('RT01','Juan','Osorio','Perez',914456435);

create table cliente(
  Id_cli varchar(10) not null,
  Clav_re varchar(10) not null,
  Nombre_cli varchar(25) not null,
  A_pcli varchar(25) not null,
  A_mcli varchar(25) not null,
  Telefono_cli varchar(15) not null,
  PRIMARY KEY(Id_cli),
  index(Clav_re),
  FOREIGN KEY(Clav_re) REFERENCES recepcionista(Clav_re) ON DELETE CASCADE ON UPDATE CASCADE);

INSERT INTO cliente(Id_cli,Clav_re,Nombre_cli,A_pcli,A_mcli,telefono_cli)
VALUES ('CT02','RT01','Lalo','Perez','Perez',914456435);

create table servicio(
  Id_ser varchar(10) not null,
  Tipo_ser varchar(25) not null,
  Fecha date not null,
  primary key(Id_ser));

INSERT INTO servicio(Id_ser,Tipo_ser,Fecha)
VALUES ('ST03','servicio3','2000-10-22');

create table cliSer(
  Id_cli varchar(10),
  Id_ser varchar(10),
  PRIMARY KEY(Id_cli,Id_ser),
  INDEX(Id_cli),
  INDEX(Id_ser),
  FOREIGN KEY(Id_cli) REFERENCES cliente(Id_cli) ON DELETE CASCADE ON UPDATE CASCADE,
  FOREIGN KEY(Id_ser) REFERENCES servicio(Id_ser) ON DELETE CASCADE ON UPDATE CASCADE);

INSERT INTO cliSer(Id_cli,Id_ser) VALUES ('CT02','ST03'); ('CT01','ST01'), ('CT01','ST02');

create table habitacion(
  Id_hab varchar(10) not null,
  Id_cli varchar(10) not null,
  Num_hab int(10) not null,
  Num_piso int(10) not null,
  Dias int(10) not null,
  Precio int(10) not null,
  Tip_hab varchar(20) not null,
  PRIMARY KEY(Id_hab),
  index(Id_cli),
  FOREIGN KEY(Id_cli) REFERENCES cliente(Id_cli) ON DELETE CASCADE ON UPDATE CASCADE);

INSERT INTO habitacion(Id_hab,Id_cli,Num_hab,Num_piso,Dias,Precio,Tip_hab)
VALUES ('H01','CT01',01,2,1,1200,'sencilla');
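For reference, these diagnostic queries show which parent keys exist for the foreign key checks above (a sketch run against the schema as posted):

```sql
-- The habitacion insert references Id_cli = 'CT01', so the foreign key is
-- validated against cliente. Listing the existing keys shows what the
-- constraint will accept:
SELECT Id_cli FROM cliente;    -- only 'CT02' was inserted above
SELECT Id_ser FROM servicio;   -- only 'ST03' was inserted above
```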

Ubuntu 19.04 Install fails with Installer crash

I’ve tried everything I can think of to solve this dilemma. Yesterday I had an Ubuntu 19.04 computer; it was getting a bit bloated, and some things (like multiple DLNA media servers) were getting a bit out of hand, so I came up with what seemed a brilliant solution. First, I had a near-new 1 TB HDD sitting doing nothing; it’s about 3 weeks old. Second, the current 2 TB hybrid drive tested OK using smartmontools, so I left it installed. Third, I powered down the computer, disconnected it from the power outlet, pressed the ‘on’ button a few times, opened the case, added the 1 TB drive to a spare SATA channel and hooked it up to the PSU.

On another computer I downloaded a fresh ISO of Disco Dingo and used Rufus to create a bootable USB drive.

I hooked the target PC back up to power/monitor/keyboard and mouse and booted from USB.

Things seemed to be going swimmingly, along the same path as the initial install. (Remember, this is going to be a clean install, and all the data I need is on separate back-up solutions.)

So: Install Ubuntu is selected, then Erase drive and use LVM, use third-party drivers and codecs, and install updates during installation to save time. Then Install Now, then select the location, then a name for the PC on the network, plus a username and password for me. The installer starts ticking over for what seems like hours (actually about 20 minutes), until it comes to a complete halt, with a window on the monitor telling me that the CD is dirty or was created too fast, or that the target drive is failing. But I’m not using a CD/DVD, one drive is brand spanking new, and the other (according to S.M.A.R.T., https://en.wikipedia.org/wiki/S.M.A.R.T.) is 360 days old with parameters within the norm.

So I’m stuck. The installer then crashes, and I send a report using the automated bug reporting tool.

And I start again: a new ISO (which passes the SHA256 test) and a different USB stick (in case the old one has some flaw).
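For completeness, the SHA256 test mentioned above is just sha256sum against the downloaded image, with the digest compared to Ubuntu’s published SHA256SUMS file. A self-contained sketch of the check, using a throwaway file since the real ISO name isn’t reproduced here:

```shell
# Demonstrates the shape of the check on a throwaway file; with the real
# image you would run sha256sum on the ISO and compare the digest against
# the matching entry in the published SHA256SUMS file.
printf 'demo contents' > /tmp/demo.img
sum=$(sha256sum /tmp/demo.img | awk '{print $1}')
echo "$sum"
```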

First, boot from the USB; so far so good. A window in the middle of the screen asks me to try Ubuntu or install it. I’ve already tried it, so I take the install option. I only use English and it’s an English (US) keyboard -> CONTINUE. Normal installation / Download updates / Install third-party software for graphics, wi-fi and media formats -> CONTINUE. We wait a few minutes while the installer does its downloading, and finally, again, and again: “The installation failed. [Errno 5] Input/output error. This is often due to a faulty CD/DVD disc or drive, or a faulty hard disk. It may help to clean the CD/DVD, to burn the CD/DVD at a lower speed, to clean the CD/DVD drive lens (cleaning kits are often available from electronics suppliers), to check whether the hard disk is old and in need of a replacement, or to move the system to a cooler environment.”

So as the kids say – WTF !

There aren’t any CD/DVD issues in play, and the oldest drive is less than a year old. It’s the middle of winter and it’s 14 degrees.

I suspect I may have to try a different installer or something

VMware Player fails to install but no errors given

I am trying to install VMware Player on my workstation (Ubuntu 18.04.3 LTS). I’ve followed the instructions for installing from a bundle (download, chmod, and sudo ./bundle_file), after which the installation process goes through a number of steps and finishes without any obvious errors. Despite there being no errors, closer scrutiny of the output suggests that nothing was actually installed.

The command ran as follows:

sudo ./VMware-Player-15.5.0-14665864.x86_64.bundle
Extracting VMware Installer...done.
Installing VMware Player Setup 15.5.0
Copying files...
Rolling back VMware Player Setup 15.5.0
Removing files...
Deconfiguring...

The attempted installation takes approx 9 seconds.

The complete installation log (/tmp/vmware-root/vmware-vmis-20775.log) is as follows:

2019-09-23T13:13:06.271+09:30| host-20775| I125: The process is 64-bit.
2019-09-23T13:13:06.271+09:30| host-20775| I125: Host codepage=UTF-8 encoding=UTF-8
2019-09-23T13:13:06.271+09:30| host-20775| I125: Host is Linux 4.15.0-64-generic Ubuntu 18.04.3 LTS
2019-09-23T13:13:06.270+09:30| host-20775| I125: DictionaryLoad: Cannot open file "/usr/lib/vmware/settings": No such file or directory.
2019-09-23T13:13:06.270+09:30| host-20775| I125: [msg.dictionary.load.openFailed] Cannot open file "/usr/lib/vmware/settings": No such file or directory.
2019-09-23T13:13:06.270+09:30| host-20775| I125: PREF Optional preferences file not found at /usr/lib/vmware/settings. Using default values.

No vmplayer command is created anywhere that I can find.

I’ve also tried downloading and running the full workstation bundle (VMware-Workstation-Full-15.5.0-14665864.x86_64.bundle). Output of the run attempt is nearly identical (only the name changes).

I’ve spent many hours crawling through Q&A forums and cannot find a similar case. Does anyone know how to fix this? I need vmplayer working ASAP.

OpenStack deployment with juju fails connecting to mysql

I’m deploying OpenStack using juju. All instances come up, but it seems that none of them can communicate with the mysql server. juju status shows:

hook failed: “shared-db-relation-changed”

for neutron-api, glance and nova-cloud-controller. If I look in juju debug-log, I get messages like

“Host \’\’ is not allowed to connect to this MySQL server”

Users nova, glance, neutron and keystone have been created by the juju charm and can be listed in mysql. I have looked in /etc/mysql/percona-xtradb-cluster.conf.d/mysqld.cnf and found that bind-address is set to … What can be wrong?
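For what it’s worth, the check I plan to run next on the mysql node is to list which hosts each service user is allowed to connect from (a sketch; the user names come from the charm output above):

```sql
-- If the grants were created for specific hosts only, a client connecting
-- from an empty or unexpected host would be rejected exactly as in the
-- "Host '' is not allowed to connect" message.
SELECT User, Host
FROM mysql.user
WHERE User IN ('nova', 'glance', 'neutron', 'keystone');
```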

PlayOnLinux+ReaderDC fails on system-encrypted Ubuntu 16.04

I’m using Ubuntu 16.04.6 with full-system encryption. I installed wine:i386, then PlayOnLinux (“POL”), then Adobe Acrobat Reader DC (via the downloaded Adobe installer, not whatever is built into POL).

POL launches without visible incident. DC launches from within POL without visible incident.

When I try to open a PDF document, the DC window greys over as if it is stuck waiting for resources. It stays that way (I gave up after a 20-minute wait).

It occurred to me that full-system encryption might interfere with how WINE functions, but in looking around for similar problem reports I haven’t found anything.

Any ideas?

[ This is a newly set up Ubuntu; no work is at risk yet. I’m working on how to get various tools paralleled from Windows, and DC is a crucial one. Encryption is not optional. ]