Paginating the Azure EA API response (consumption.azure.com) – usagedetails nextLink

I'm working with the Azure EA API, which lets me query the details of my Azure consumption over a date range. I'm currently working in C#, where I call the API with the following URL:

https://consumption.azure.com/v2/enrollments/xxxxxx/usagedetailsbycustomdate?startTime=2019-01-01&endTime=2019-02-01

The API returns JSON, which I then parse into a RootObject class with the following structure:

public class RootObject
{
    public string id { get; set; }
    public List<Datum> data { get; set; }
    public string nextLink { get; set; }
}

The nextLink value is the link that lets me retrieve the next page of results, since a single request returns at most 1000 records. How can I get the values of the next page? When I try to fetch the data again using the URL returned by the first request, I get an error.

Value nextLink : https://consumption.azure.com/v2/enrollments/xxxxxx/usagedetails/nextpage?sessiontoken=1:xxxxxxx&skiptoken=xxxx313030307C313030307C46616C73657C54xxxxC30&skiptokenver=v1&id=862095db-xxxxx-4bb3-9cf4-xxxx

string token = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx";
string RouteforDate = "https://consumption.azure.com/v2/enrollments/xxxxx/usagedetailsbycustomdate?startTime=2019-01-01&endTime=2019-02-01";

HttpWebRequest request = (HttpWebRequest)WebRequest.Create(RouteforDate);
request.Headers.Add(HttpRequestHeader.Authorization, "Bearer " + token);
request.ContentType = "application/json";

try
{
    // Call the REST endpoint.
    Console.WriteLine("Calling Usage service...");
    HttpWebResponse response = (HttpWebResponse)request.GetResponse();
    Console.WriteLine(String.Format("Usage service response status: {0}", response.StatusDescription));

    // Pipe the response stream into a reader with the required encoding.
    Stream receiveStream = response.GetResponseStream();
    StreamReader readStream = new StreamReader(receiveStream, Encoding.UTF8);
    var usageResponse = readStream.ReadToEnd();
    Console.WriteLine("Usage stream received.");

    // Deserialize the JSON into the strongly typed RootObject.
    var payload = JsonConvert.DeserializeObject<RootObject>(usageResponse);
    var url = payload.nextLink;

    /////////////// Get the next page from the nextLink url here *****************

    response.Close();
    readStream.Close();
    Console.WriteLine("JSON output complete. Press ENTER to close.");
    Console.ReadLine();
}
catch (Exception e)
{
    Console.WriteLine(String.Format("{0} \n\n{1}", e.Message, e.InnerException != null ? e.InnerException.Message : ""));
    Console.ReadLine();
}

I already have this done in my PHP/JavaScript code; I'm trying to do the same thing but in C#:

<script type="text/javascript">
'use strict'

var BaseUrlTemplate = 'https://ea.azure.com'
var Route = 'https://consumption.azure.com/v1/enrollments/xxxxxxxx/usagedetails'
var RouteforDate = 'https://consumption.azure.com/v2/enrollments/{xxxxxx}/usagedetailsbycustomdate'
var header = 'bearer xxxxxxxxxxxxxxxx[token]'

async function SearchDate() {
    var date = new Date();
    var dateString = new Date(date.getTime() - (date.getTimezoneOffset() * 60000))
        .toISOString()
        .split("T")[0];

    axios.defaults.headers.common['Authorization'] = header;
    const response = await axios.get(RouteforDate + "?startTime=2017-01-01&endTime=" + dateString);

    var data = JSON.stringify(response.data);
    var array = jQuery.parseJSON(data);
    var datos = array.data;
    var nextlink = array.nextLink;

    while (nextlink != null) {
        const insertdata = await axios.post("../DB/insertdata.php", { "data": datos });
        var nextResponse = await axios.get(nextlink, {});
        data = JSON.stringify(nextResponse.data);
        array = jQuery.parseJSON(data);
        datos = array.data;
        nextlink = array.nextLink;
    }
}
</script>
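For reference, here is a minimal C# sketch of the paging loop I am trying to build, assuming the nextLink URL accepts the same Bearer token and reusing the RootObject/Datum classes and usings from the code above (FetchPage and FetchAllPages are just illustrative helper names):

// Sketch: keep requesting pages until the API stops returning a nextLink.
static RootObject FetchPage(string url, string token)
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
    request.Headers.Add(HttpRequestHeader.Authorization, "Bearer " + token);
    request.ContentType = "application/json";

    using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
    using (StreamReader reader = new StreamReader(response.GetResponseStream(), Encoding.UTF8))
    {
        return JsonConvert.DeserializeObject<RootObject>(reader.ReadToEnd());
    }
}

static List<Datum> FetchAllPages(string firstUrl, string token)
{
    var allRows = new List<Datum>();
    string url = firstUrl;

    // nextLink is null/empty on the last page, which ends the loop.
    while (!string.IsNullOrEmpty(url))
    {
        RootObject page = FetchPage(url, token);
        allRows.AddRange(page.data);
        url = page.nextLink;
    }
    return allRows;
}

The idea mirrors the JavaScript loop: request the first URL, collect data, then follow nextLink until it comes back empty.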

Managing Azure Storage Account’s lifecycle via Terraform

I’m creating / managing an Azure Storage Account via Terraform using azurerm_storage_account (https://www.terraform.io/docs/providers/azurerm/r/storage_account.html). As I don’t see it in the data sources / resources list, I am wondering whether it is possible to manage the Storage Account’s blob lifecycle policy (https://docs.microsoft.com/en-us/azure/storage/blobs/storage-lifecycle-management-concepts) via Terraform, to make the IaC experience complete?
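For context, something along these lines is what I am hoping to be able to write. Newer versions of the azurerm provider expose lifecycle rules through a management-policy resource, but the exact resource and attribute names may differ by provider version, so treat this as a sketch rather than working configuration:

# Hypothetical sketch: lifecycle (management) policy attached to an existing storage account.
resource "azurerm_storage_management_policy" "example" {
  storage_account_id = azurerm_storage_account.example.id

  rule {
    name    = "tier-then-delete"
    enabled = true

    filters {
      prefix_match = ["container1/logs"]
      blob_types   = ["blockBlob"]
    }

    actions {
      base_blob {
        tier_to_cool_after_days_since_modification_greater_than    = 30
        tier_to_archive_after_days_since_modification_greater_than = 90
        delete_after_days_since_modification_greater_than          = 365
      }
      snapshot {
        delete_after_days_since_creation_greater_than = 30
      }
    }
  }
}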

Missing files after Azure file sync forced tiering

I ran Invoke-StorageSyncCloudTiering on the server where we set up Azure File Sync, and most of the files are now missing. I get the result below. How do I get the files back? I can't see them in the portal anymore either.

PS C:\windows\system32> Invoke-StorageSyncCloudTiering -Path E:\Shares\Users

TieredCount                            : 193
AlreadyTieredCount                     : 0
SkippedForSizeCount                    : 58846
SkippedForAgeCount                     : 0
SkippedForUnsupportedReparsePointCount : 0
FailedToTierCount                      : 186995
ReclaimedSpaceBytes                    : 180616517
SkippedForStableVersionFailure         : 0
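If the files were only tiered (replaced by reparse-point stubs) rather than deleted, is recalling them with the agent's recall cmdlet the right way back? Something like the following is what I would try; the module path below is the default agent install location and may differ on your server:

# Sketch: recall tiered files back to the local server (default agent path assumed).
Import-Module "C:\Program Files\Azure\StorageSyncAgent\StorageSync.Management.ServerCmdlets.dll"
Invoke-StorageSyncFileRecall -Path "E:\Shares\Users"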

Azure Pipelines build doesn't build everything

Any idea why my Azure Pipelines build builds my project correctly but doesn't put all of my .exe and .dll files in my "drop" artifact?

I have a solution composed of 10 projects.

(screenshot: the projects in the solution)

My ‘Client’ project is an old classic VB.NET WinForms project. It is not visible in my drop artifact:

(screenshot: the contents of the drop artifact)

My AutoUpdate project is also missing.

It seems the build system doesn't build my executable projects. What can I do? Is this normal? Is it a limitation of something, or wrong settings?
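For context, these are roughly the build and publish steps I expect to need, if expressed in pipeline YAML (solution path and names are placeholders):

steps:
- task: VSBuild@1
  inputs:
    solution: '**/*.sln'
    platform: 'Any CPU'
    configuration: 'Release'
    # Redirect every project's output into the staging folder so the
    # WinForms .exe projects end up in the artifact as well.
    msbuildArgs: '/p:OutDir=$(Build.ArtifactStagingDirectory)\'

- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'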

Azure DNS not propagating

I forwarded my domain to the Azure DNS nameservers:

ns1-01.azure-dns.com.
ns2-01.azure-dns.net.
ns3-01.azure-dns.org.
ns4-01.azure-dns.info.

That’s an NS record on the Azure side. The registrar is …not entirely important, as far as I know, but it’s Ionos.

I have two servers on different sides of the world. One is an Azure VM. The A records used to just point to that VM. I am now trying to redirect some of the records to my new server; e.g. @.mydomain.org, with CNAME www.mydomain.org.

This wasn’t my problem; I don’t have any inaccurate records. I don’t know whether it could be caching along the way, because I’m not sure how to check the nameservers’ resolvers. The four nameservers above all return the address I want; 1.1.1.1 and 8.8.8.8 return the old IP. It’s been 24 hours since I changed these records, and the original TTL was 12 hours. I’ve since lowered it to ten minutes (600 s).

I have never had this happen before with DNS so I don’t have the foggiest idea of how to fix it. Any suggestions or advice would be appreciated. Happy to post more information if it is helpful.
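For reference, checks along these lines show the difference (mydomain.org stands in for the real domain):

# Follow the delegation from the root to see which nameservers the parent zone returns
dig NS mydomain.org +trace

# Ask the Azure nameserver directly, then a public resolver, and compare the answers
dig @ns1-01.azure-dns.com mydomain.org A +short
dig @8.8.8.8 mydomain.org A +short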

How to apply a Kubernetes cluster to existing Azure virtual machines

I have existing Azure virtual machines running 30 Docker containers. I have decided to use a Kubernetes service/cluster to manage deploying the Docker containers on those existing Azure virtual machines. I have also deployed an Azure Container Registry to store the Docker images.

Is this possible? Please share your opinion.
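If it is possible, I imagine it would be something like bootstrapping the cluster directly on the VMs with kubeadm (the IP, token and hash below are placeholders that kubeadm init prints):

# On the VM chosen as the control plane (Docker already installed):
sudo kubeadm init --pod-network-cidr=10.244.0.0/16

# On each of the other VMs, join the cluster with the values printed by kubeadm init:
sudo kubeadm join <control-plane-ip>:6443 --token <token> \
    --discovery-token-ca-cert-hash sha256:<hash>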

Azure SQL Server – big query choosing bad execution plan

I’m using Azure SQL (Server v12) and I have a big, complex query that runs fine, but when I add more joins, no matter which table I join to, it gets very slow. The important thing to point out is that adding a simple join makes it choose a different execution plan for the whole query, and I get warnings on sort operations that are not performed in the original query.

I cannot post the query or the execution plan but I want to know if someone faced a similar case. My intuition tells me that SQL Server is choosing a bad execution plan because the query is so big that it times out when it tries to pick one.

Should I specify the join type and index name for each join to keep the original execution plan? Is that the best way to go?
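To be concrete about what I mean by specifying the join type and index, here is a hypothetical example (table and index names are made up for illustration):

-- Force a hash join and a specific index, and keep the written join order.
SELECT o.OrderId, c.CustomerName
FROM dbo.Orders AS o WITH (INDEX (IX_Orders_CustomerId))
INNER HASH JOIN dbo.Customers AS c
    ON c.CustomerId = o.CustomerId
OPTION (FORCE ORDER);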

Azure Files access from Ubuntu 18.04: Resource temporarily unavailable

From time to time, a server (Ubuntu 18.04 LTS) fails to run PHP scripts from cron, with the following error output:

PHP Warning:  require_once(/path/to/include.php): failed to open stream: Resource temporarily unavailable in /path/to/script.php on line 2
PHP Fatal error:  require_once(): Failed opening required 'include.php' (include_path='.') in /path/to/script.php on line 2

The script suddenly works again a few hours later. How can I solve this problem?

$ uname -a
Linux hostname 4.15.0-47-generic #50-Ubuntu SMP Wed Mar 13 10:44:52 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux

$ dpkg -l | grep cifs
ii  cifs-utils    2:6.8-1    amd64    Common Internet File System utilities
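For context, an /etc/fstab line for an Azure Files SMB mount typically looks like the following (the storage account, share and credentials file are placeholders, options as in the standard Azure Files mount examples):

//mystorageacct.file.core.windows.net/myshare /mnt/myshare cifs nofail,vers=3.0,credentials=/etc/smbcredentials/mystorageacct.cred,dir_mode=0777,file_mode=0777,serverino 0 0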

Unable to connect to CentOS VM on Azure

I created a virtual machine on Azure using CentOS 7. I’ve tried it with 2 different images. I suspect it has something to do with me having multiple SSH keys. I was able to ssh in by specifying the key: ssh -i ~/.ssh/id_rsa antarr@0.0.0.0. I copied the other key using ssh-copy-id -i ~/.ssh/other antarr@0.0.0.0, and I’m now able to ssh in without specifying which key to use. But when trying to connect using RDP, I get a connection refused error. I have verified that the RDP port is configured on the VM.
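For reference, these are roughly the steps I understand are needed for RDP to a Linux VM: a desktop environment plus xrdp on the VM, and port 3389 open in the network security group (resource group and VM names below are placeholders):

# On the CentOS VM (assumes a desktop environment such as GNOME is installed):
sudo yum install -y epel-release
sudo yum install -y xrdp
sudo systemctl enable --now xrdp
sudo firewall-cmd --add-port=3389/tcp --permanent
sudo firewall-cmd --reload

# From my workstation, also open the port on the Azure side:
az vm open-port --resource-group myResourceGroup --name myVM --port 3389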