CORS issue with WebApp on Azure + AAD

I would like to get some experts’ advice regarding the way I resolved the following issue:

Let’s say my personal app is hosted as an Azure App Service. The JS client code running within the user’s browser does some GETs on the API it serves:

const myUrl = "";
const myHeaders = {};
const myInit = {
    method: "GET",
    headers: myHeaders
};

fetch(myUrl, myInit);

On their first visit to the web page, the user has to authenticate to Azure Active Directory; after that they can browse freely.


The fetch calls to the API get redirected to another origin, and the browser blocks the replies per CORS:

Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at (Reason: CORS header ‘Access-Control-Allow-Origin’ missing) 

My solution:

I believe the fetch redirection happens within Azure, before the request reaches my backend (no app logs were visible). I added the field credentials: "include" to myInit so that the request carries the authentication cookies and passes the Azure authentication check. It works now: no redirection to another origin happens, the requests reach the backend, and the replies reach the client 😀

const myUrl = "";
const myHeaders = {};
const myInit = {
    method: "GET",
    headers: myHeaders,
    credentials: "include"
};

fetch(myUrl, myInit);

I am still a beginner on Azure, and I got this working by chance. Do you think this is the right approach?
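For context, my current understanding is that if the replies really did have to cross origins with credentials, the server itself would have to opt in explicitly. A minimal sketch of the headers involved (the allowed origin below is an invented example, not my real app's address):

```javascript
// Sketch of the CORS headers a server must send before a browser will
// expose credentialed cross-origin replies. The origin is an assumption.
const ALLOWED_ORIGIN = "https://my-spa.example.com";

function corsHeaders(requestOrigin) {
    // With credentials, the wildcard "*" is not allowed: the server must
    // echo a specific origin and set Allow-Credentials explicitly.
    if (requestOrigin !== ALLOWED_ORIGIN) return {};
    return {
        "Access-Control-Allow-Origin": requestOrigin,
        "Access-Control-Allow-Credentials": "true",
        "Vary": "Origin"
    };
}
```

Since my API appears to live on the same origin as the page, none of this should be needed once the redirect stops, which is why credentials: "include" seems to fix it.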

Investigate potential breach in Azure App Service

I asked on Server Fault, but have not received any response, so I thought I would try here.

We suspect we have had a data breach, but we are not sure how to investigate it to determine the source of the breach or what data was sent.

We have an app service that has been running for a while with steady usage. We noticed that over the last couple of nights there have been large spikes in data out. Our website has an authenticated user area and we are concerned that there may have been a breach or something unauthorized happening on the site.

The site has consistently stayed below 10MB/15min Data Out. But the sudden spike was over 180MB, then instantly back down again. The second night the spike was 600MB. In the same 15-minute metric window, Average CPU time spiked to over one hour. Response time, number of requests and 4xx/5xx errors all remained steady.

Azure metrics graph

Is there a way using Azure (Metrics or Security Center) to determine what caused the massive spike in Data Out: what data was sent, who it was sent to, etc.? Is there anything we can enable within Azure to let us view this data if it happens again tonight (e.g. Azure Sentinel)?

Looking at other metrics, there was no obvious spike in 4XX or 5XX errors or number of requests, so we do not suspect a brute force or DoS attack.
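In the meantime I've sketched a way to pull the raw metric programmatically so we can at least pin down the exact spike timestamps to correlate with logs. This uses @azure/monitor-query; the resource ID is a placeholder, and I am assuming "BytesSent" is the underlying name of the "Data Out" metric for App Service:

```javascript
// Sketch: pull the App Service "Data Out" metric and flag 15-minute
// windows above a threshold, to correlate spike timestamps with logs.
async function fetchDataOut(resourceId) {
    // Lazy-require the SDK so the pure helper below can run on its own.
    const { DefaultAzureCredential } = require("@azure/identity");
    const { MetricsQueryClient } = require("@azure/monitor-query");
    const client = new MetricsQueryClient(new DefaultAzureCredential());
    const result = await client.queryResource(resourceId, ["BytesSent"], {
        granularity: "PT15M",
        timespan: { duration: "P2D" } // cover the two nights in question
    });
    // Each point has a timeStamp and a total (bytes for the window).
    return result.metrics[0].timeseries[0].data;
}

// Pure helper: return only the windows whose Data Out exceeds a threshold.
function findSpikes(points, thresholdBytes) {
    return points.filter((p) => (p.total ?? 0) > thresholdBytes);
}
```

This only narrows down *when*, not *what* or *to whom*, which is why I'm asking about enabling richer logging before tonight.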

Can one use a certificate directly from Microsoft Azure Key Vault for LDAP/S?

The only method I can seem to find to add a certificate for secure LDAP (LDAP/S) for Azure Active Directory Domain Services is to upload the certificate from my local computer. This seems like a very poor key-management solution when Microsoft Azure Key Vault is available for creating and storing key pairs and certificates. Am I missing something? Is there a way to use a certificate and key pair directly from a Key Vault, or must I download them from the Key Vault and then upload them for LDAP/S? Best PKI practice dictates that I never access the private key directly.
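For what it's worth, the closest thing I've found is pulling the PFX out of the vault programmatically and then uploading it, which still touches the private key, hence my question. A sketch using @azure/keyvault-secrets (a Key Vault certificate's private key is exposed as a secret whose value is the base64-encoded PFX; vault URL and certificate name are placeholders):

```javascript
// Sketch: retrieve a certificate's PFX from Key Vault. This still
// materialises the private key on the client, which is what I'd like
// to avoid. Vault URL and certificate name are placeholders.
async function downloadPfx(vaultUrl, certName) {
    // Lazy-require the SDK so the pure helper below can run on its own.
    const { DefaultAzureCredential } = require("@azure/identity");
    const { SecretClient } = require("@azure/keyvault-secrets");
    const client = new SecretClient(vaultUrl, new DefaultAzureCredential());
    const secret = await client.getSecret(certName);
    return pfxFromSecretValue(secret.value);
}

// Pure helper: decode the base64 secret value into the raw PFX bytes.
function pfxFromSecretValue(base64Value) {
    return Buffer.from(base64Value, "base64");
}
```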

How do I configure Azure Web Apps so that the only access is via CloudFlare?

I have a Web App (Linux) application on Azure, and I added a custom domain which I have protected with CloudFlare.

I added Azure Security Center to my subscription.

At the moment one can access the application either

  1. directly via its default Azure address, or
  2. via the custom domain, which is protected by CloudFlare

How do I configure the Azure portal so that the only access to my web application is via CloudFlare?

One idea I had is to add an Azure Firewall and set it to allow only the CloudFlare IP addresses, but I wondered if there is an easier way (and in any case I am not sure how to configure it).
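Since posting, I've been reading about App Service access restrictions (under the app's Networking settings), which seem simpler than a full Azure Firewall. A sketch of generating the restriction entries from CloudFlare's published ranges; the two CIDRs below are examples from cloudflare.com/ips, not the complete list, and the entry shape is my reading of the ipSecurityRestrictions ARM property:

```javascript
// Sketch: build App Service ipSecurityRestrictions entries that allow only
// CloudFlare's published ingress ranges. Two example CIDRs only; the real
// list at cloudflare.com/ips should be used in full.
const CLOUDFLARE_RANGES = ["173.245.48.0/20", "103.21.244.0/22"];

function buildRestrictions(ranges) {
    return ranges.map((cidr, i) => ({
        ipAddress: cidr,       // CIDR allowed through
        action: "Allow",       // everything else is implicitly denied
        priority: 100 + i,     // lower number = evaluated first
        name: `cloudflare-${i}`
    }));
}
```

Would this be the recommended approach, or is there something better-suited?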

Using Temp Tables in Azure Data Studio Notebooks

tl;dr I want to use temp tables across multiple cells in a Jupyter Notebook to save CPU time on our SQL Server instances.

I’m trying to modernize a bunch of the monitoring queries that I run daily as a DBA. We use a real monitoring tool for almost all of our server-level stuff, but we’re a small shop, so monitoring the actual application logs falls on the DBA team as well (we’re trying to fix that). Currently we just have a pile of mostly undocumented stored procedures we run every morning, but I want something a little less arcane, so I am looking into Jupyter Notebooks in Azure Data Studio.

One of our standard practices is to take all of the logs from the past day and drop them into a temp table, filtering out all of the noise. After that we run a dozen or so aggregate queries on the filtered temp table to produce meaningful results. I want to do something like this:

Cell 1

Markdown description of the loading process, with details on available variables 

Cell 2

T-SQL statements to populate the temp table(s)

Cell 3

Markdown description of next aggregate 

Cell 4

T-SQL to produce the aggregate

The problem is that each cell seems to run in an independent session, so the temp tables from cell 2 are all gone by the time I run any later cells (even if I use the “Run cells” button to run everything in order).

I could simply create staging tables in the user database and write my filtered logs there, but eventually I’d like to be able to hand the notebooks off to the dev teams and have them run the monitoring queries themselves. We don’t give write access on any prod reporting replicas, and it would not be feasible to create a separate schema that devs can write to (for several reasons, not the least of which being that I am nowhere near qualified to recreate tempdb in a user database).
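To make the placeholders above concrete, here is roughly the shape of what cells 2 and 4 would contain; all table and column names are invented for illustration:

```sql
-- Cell 2: load the last day of logs into a temp table, filtering noise.
SELECT LogTime, Severity, Message
INTO #FilteredLogs
FROM dbo.AppLogs
WHERE LogTime >= DATEADD(DAY, -1, SYSUTCDATETIME())
  AND Severity >= 2;          -- drop informational noise

-- Cell 4: one of the dozen aggregates run against the filtered set.
SELECT Severity, COUNT(*) AS Occurrences
FROM #FilteredLogs
GROUP BY Severity
ORDER BY Occurrences DESC;
```

It's the #FilteredLogs reference in the later cell that breaks when each cell gets its own session.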

Remote receiver debugging using Azure Service Bus does not work

After I published the app (not sure if this is related) to the O365 app catalog (not the store), I cannot debug the app-installed (or app-uninstalling) receivers using Azure Service Bus anymore. I even tried removing the remote endpoints from the app manifest, but it did not help.

I also needed to increment the app version in the manifest to even be able to attempt debugging, as an app with the same version was already in the app catalog.

This is what is in the output window:

Successfully installed app for SharePoint.
Services/AppEventReceiver.svc has been registered on Windows Azure Service Bus successfully.
Services/AppEventReceiver.svc has been registered on Windows Azure Service Bus successfully.

When I click Start, the app is uninstalled (if it was previously installed), then installed; then VS drops out of debug mode, and then the messages appear saying the service bus endpoints were registered successfully. Then Internet Explorer starts and VS goes back into debug mode (probably for debugging the JavaScript in IE), but this happens after the app-installed receiver has already finished (successfully, by the way).

“Enable debugging via Windows Azure Service Bus” is checked in the project settings, and the connection string to the Azure Service Bus is provided. I created the service bus namespace using PowerShell, so it does support the necessary authentication methods. The related web project is also set up in the app project’s properties. Debugging via the service bus worked for me on this project before.

Any ideas?

Azure AD – Guest user SSO to RDP

We have two Azure subscriptions: (1) the parent company’s, where all the users live in Azure AD (synchronised from the on-premises AD), and (2) ours, which holds the 365 apps dedicated to our company.

  • Is it possible out of the box (e.g. using Enterprise apps, or by simply inviting Guest accounts) to allow users from (1) to SSO into RDP sessions (Windows Virtual Desktop / Terminal Services server / Windows 10 machines, etc.)?
  • Is it possible using a 3rd-party solution or certificates (similarly to SSH)? We have full control over the laptop builds (Windows 10 Enterprise).

SharePoint Online/365 integration (upload files) from a React app hosted on Azure with a C# .NET Core 2.2 Web API

I’ve been struggling for some weeks now, trying to automate uploading files from a web app created in React to a specific folder in a SharePoint Online site. The web app is hosted in Azure and uses C# .NET Core 2.2 as the backend.

I’m trying to find some kind of REST API that would help me with this task (it could be in React on the frontend, or in C# Core or C# .NET Framework on the backend). I’ve searched across the internet for a way to do it, but all my attempts have failed.

Can someone give me some insight, a tip or advice on how to achieve this?

I’ve tried:

  • Using the code from the Microsoft web page (using jQuery).

  • Using PnP, but on my localhost I get a CORS problem (I’m using a Client ID and Client Secret to interact with SharePoint).
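In case it helps frame answers, this is the kind of call I've been attempting, translated to plain fetch against the SharePoint REST Files/add endpoint. The site URL, folder path and token acquisition are placeholders; I'm not certain this is the right endpoint for my scenario:

```javascript
// Sketch: upload a file to a SharePoint Online folder via the SharePoint
// REST API. Site URL, folder path and bearer-token acquisition are
// placeholders for whatever the app's auth flow provides.
function buildUploadUrl(siteUrl, folderServerRelativeUrl, fileName) {
    return `${siteUrl}/_api/web/GetFolderByServerRelativeUrl('` +
        `${folderServerRelativeUrl}')/Files/add(url='${fileName}',overwrite=true)`;
}

async function uploadFile(siteUrl, folder, fileName, bytes, accessToken) {
    const res = await fetch(buildUploadUrl(siteUrl, folder, fileName), {
        method: "POST",
        headers: {
            Authorization: `Bearer ${accessToken}`,
            Accept: "application/json;odata=verbose"
        },
        body: bytes // ArrayBuffer / Buffer with the file contents
    });
    if (!res.ok) throw new Error(`Upload failed: ${res.status}`);
    return res.json();
}
```

Is this roughly the right shape, or should I be going through Microsoft Graph instead?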

Azure app unauthorised to download files from shared links in SharePoint

I am currently doing an integration which requires me to download files from shared links.

Users can create sharing links to their files in SharePoint, and to download a file I just need to add download=1 to the URL parameters.

My program (created from an app registration) is able to download files that are set to public. However, if any restrictions are set, for example to a group or to specific users, I receive ERROR 401 Unauthorized, even though I have already done the OAuth2 authentication (I am able to call other REST APIs, e.g. to view folders).

I have a feeling that when the links are restricted to a certain group, my client isn’t recognised as the user I’ve logged in as. I tried elevating my client’s API permissions to both delegated and application, but it still doesn’t work. Any help here?
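For completeness, the approach I'm now experimenting with is resolving the link through the Microsoft Graph /shares endpoint rather than requesting the sharing URL directly. Graph expects the sharing URL encoded as a share ID: base64 of the URL, unpadded, with "/" → "_" and "+" → "-", prefixed with "u!". Token acquisition is omitted here:

```javascript
// Sketch: resolve a sharing link via Microsoft Graph instead of hitting
// the link URL directly. encodeShareId implements the documented
// base64url "u!" encoding for sharing URLs.
function encodeShareId(sharingUrl) {
    return "u!" + Buffer.from(sharingUrl)
        .toString("base64")
        .replace(/=+$/, "")
        .replace(/\//g, "_")
        .replace(/\+/g, "-");
}

async function downloadSharedFile(sharingUrl, accessToken) {
    const url = "https://graph.microsoft.com/v1.0/shares/" +
        `${encodeShareId(sharingUrl)}/driveItem/content`;
    const res = await fetch(url, {
        headers: { Authorization: `Bearer ${accessToken}` }
    });
    if (!res.ok) throw new Error(`Download failed: ${res.status}`);
    return Buffer.from(await res.arrayBuffer());
}
```

I still hit 401 on restricted links this way, which is what makes me suspect the permission/identity mismatch described above.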

Thanks in advance!