
Quantum Development Kit 0.3 Release


Here is the 0.3 release of the Quantum Development Kit!! Main features:

  • New quantum chemistry library. The new quantum chemistry library can be used to develop quantum simulation solutions in the chemistry domain. We worked closely with chemistry experts at Pacific Northwest National Labs (based in Richland, WA) to bring real-world chemistry to Q#.

  • Improved Q# developer experience. The Quantum Development Kit now delivers deeper integration with Visual Studio and Visual Studio Code. This update includes live feedback as you type, with errors and warnings indicated by underlining. It also greatly improves the developer experience by providing code hovering, "F12 – go to definition", and more in VS and VS Code.

  • New Q# language capabilities. The Q# programming language has been extended in ways that unify how developers code common operations, such as iteration over arrays, making coding in Q# easier and faster.


Code here:

https://github.com/Microsoft/QuantumLibraries/

https://github.com/Microsoft/Quantum

 

Docs here:

https://docs.microsoft.com/en-us/quantum/

https://docs.microsoft.com/en-us/quantum/libraries/chemistry


A virtual journey around the world – the 48-hour Skype-a-Thon


An adventure for the whole class. Over the internet, students visit the rest of the world, whether in a Mystery Skype with an unknown class or in an expert talk with a scientist in Antarctica. Skype-in-the-Classroom overcomes cultural, linguistic, and geographic boundaries. Once a year, the virtual miles travelled are counted as part of the 48-hour Skype-a-Thon and benefit children in the form of educational resources. Jan-Martin Klinge's class took part as well. In a Mystery Skype, the students met a Portuguese class.

Overcoming boundaries: from the classroom out into the world

Every year, half a million learners and educators around the world connect during the 48-hour Skype-a-Thon, via Skype and straight from the classroom. The Skype-in-the-Classroom community offers several ways to connect with other students, with experts, and also with guest speakers or scientists. During the 48-hour Skype-a-Thon, the virtual miles travelled over those two days are counted. Through the organization WE, Microsoft provides educational resources for a child in need for every 400 virtual miles reached. The goal of the Skype-a-Thon is not only to raise learners' awareness of different cultures, but also to support students all over the world.

Mystery Skype: a village teacher shows how it's done

How am I, a small village teacher, supposed to get in touch with the world? That is what Jan-Martin Klinge, a teacher at a Microsoft Showcase School and an MIE Expert, asked himself. Through the community he found a Portuguese class that also wanted to take part in the 48-hour Skype-a-Thon. The two classes would now get to know each other in a Mystery Skype session.

"The children didn't know who the other class was or which country the children came from."
Jan-Martin Klinge, the class's teacher and MIE Expert

Full of excitement, the students and their teacher gathered questions to solve this riddle.

"Is your country next to an ocean?"

"Do you have to pay for school?"

Beyond that, it was important to the MIE Expert to make clear to the children that they would be representing not only their school but also Germany. Respect, decency, politeness, and seriousness during the conversation were therefore fundamental values to instil beforehand.

Motivation and a spirit of discovery: the children are thrilled

Curiously, the children quizzed each other and quickly realized that students hundreds of miles apart have quite similar wishes and needs. Using the English language also gained a clear purpose for the children: to discover the world and get to know people and cultures, you need vocabulary, and therefore English lessons.

"You learn English because you happen to like your English teacher, because it's in the English textbook, or because you have to write class tests; a real benefit is often far out of reach for the children."
Jan-Martin Klinge

Connecting lesson content with the world, with people and stories: that motivates, sparks new ideas, and awakens a spirit of discovery. Jan-Martin Klinge's students are impressed.

More Skype in the classroom

"The children asked for us to repeat the Mystery Skype and talk to other classes."
Jan-Martin Klinge

For the teacher, too, the Mystery Skype session was a complete success, especially because the Skype call showed how easy it can be in the digital age to get in touch with other schools, countries, and cultures. Finding a partner school, organizing a meeting, and running a school partnership the conventional way would have been far more laborious, Jan-Martin Klinge reports. That is why he now wants to convince his colleagues that Skype-in-the-Classroom is really worth it.

More of this: where virtual school partnerships are born

Jan-Martin Klinge has big plans: his vision is to teach lessons together with other countries and cultures instead of only covering them in theory. Beyond the Skype-a-Thon, the MIE Expert is looking for more partner schools in order to make Skype-in-the-Classroom a permanent part of everyday school life. That way, classes could be in continuous exchange with schools all over the world.

"If we are covering South Africa in English class, for example, and have a matching school from South Africa, we can call at the end of a unit and ask questions."
Jan-Martin Klinge

This international exchange creates meaningful occasions for learning and gives students tangible goals. And on top of that, the children have a lot of fun.

Enforcing TLS 1.2 use with Azure Service Bus


Long-term commitment to supporting protocols and protocol versions is a key expectation customers have from commercial cloud services. Deployed solutions, and especially widely deployed client applications that are often operated outside the original developer’s control, are expected to not become spontaneously broken and cut off from cloud services by protocols being retired. 

When it comes to long-term support, security protocols such as Transport Layer Security (TLS, colloquially also still referred to as SSL) are the toughest to manage, because information security professionals rightfully push to expediently upgrade to the latest versions at the earliest time – with information security policies reflecting this – while there are significant client deployments “out there” which cannot be easily upgraded to that latest version, often also due to platform limitations. “Just change the code to the latest runtime and recompile” is often not a practical option for a variety of reasons. 

Azure Service Bus is one of the oldest services in Azure, with some of its existing protocol surface area in active use by customers dating back to even before commercial availability in 2010, and there are quite a few applications that have been built and deployed years ago using TLS 1.0 and are considered by ISVs and/or their customers to “just work”, even if we here at Microsoft disagree from a security policy perspective. TLS 1.0 is problematic, but for some customers or their clients not problematic enough to rush to retire all uses of it in older systems. That’s why we still offer TLS 1.0 and TLS 1.1 as an endpoint option. 

That all said: If application deployments under your control or watch are still relying on TLS 1.0 or TLS 1.1, those are on borrowed time. The clock is ticking loudly and many customers who are known to have such dependencies have already received or will receive communication from Azure to that effect; the Azure platform will retire TLS 1.0 and TLS 1.1 as a matter of global policy all at the same time for services that still support it. 

Even while TLS 1.0 and TLS 1.1 remain an option on the Service Bus gateways, your own applications can ensure they are in full compliance with current policies and always use TLS 1.2. The TLS protocol version and the TLS cipher suites are ultimately always a client choice to make, and the client can always refuse to communicate further if the offered capabilities are outside of its desired compliance framework.  

Enforcing use of TLS 1.2 with in-support clients 

If your Service Bus clients are up to date, you are generally using TLS 1.2. 

If you are using any version of the official .NET Standard client (Microsoft.Azure.ServiceBus on Nuget) or version 3.4.3 or later of the .NET Framework client (WindowsAzure.ServiceBus on Nuget) and you are using the .NET Framework 4.7.1 or newer, your application will automatically follow the .NET Framework guidance model for TLS and always follow the OS configuration settings and respective .NET Framework overrides. Current versions of Windows default to using TLS 1.2. 

For .NET Framework 4.6, you will have to enforce the use of TLS 1.2 in the startup of your application by setting  

        ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12 

or by enforcing the use of HTTPS tunnelling (see further below). 
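
For illustration, here is a minimal sketch of what that startup code could look like for a .NET Framework 4.6 application using the WindowsAzure.ServiceBus client; the connection string and queue name are placeholders:

using System.Net;
using Microsoft.ServiceBus.Messaging; // WindowsAzure.ServiceBus NuGet package

class Program
{
    static void Main()
    {
        // Opt in to TLS 1.2 before any Service Bus connection is opened.
        // (Not needed on .NET Framework 4.7.1 or later, where the OS defaults apply.)
        ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;

        // Placeholder connection string and queue name, for illustration only.
        var client = QueueClient.CreateFromConnectionString(
            "<your-service-bus-connection-string>", "myqueue");
        client.Send(new BrokeredMessage("Sent over TLS 1.2"));
        client.Close();
    }
}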

If you are using the official Java client (azure-servicebus on Maven), the client will automatically follow the Java platform rules for enforcing TLS 1.2, where TLS 1.2 is the default. 

If you are using a custom AMQP 1.0 client, refer to the respective project documentation for how to enforce TLS 1.2 use. The Microsoft lower-level AMQP libraries for .NET (Microsoft.Azure.Amqp and AmqpNetLite) follow the .NET Framework guidance and default to TLS 1.2. The Apache Qpid Proton AMQP libraries generally default to TLS 1.2 today. 

Enforcing use of TLS 1.2 with out-of-support clients 

The only scenarios where enforcing use of TLS 1.2 is substantially trickier involve using out-of-date clients where it’s impossible to follow the .NET Framework guidance to upgrade to .NET 4.7.2. 

If you are using the .NET Framework client (WindowsAzure.ServiceBus on Nuget) with version 4.5.x of the .NET Framework and you are using the AMQP transport, you must use version 3.4.3 of that client or later to enforce the use of TLS 1.2. 

If you are instead using the (default) NetMessaging transport, which is based on WCF, the .NET Framework versions 4.5.x and earlier have TLS version 1.0 hardcoded for the native WCF transport option, but for Service Bus you can circumvent this when you enforce HTTPS tunnelling, which follows the operating system default rules.  

If you don’t have source code access or can’t change your deployment, and the client does not explicitly override the ServiceBusEnvironment.SystemConnectivity.Mode setting to ConnectivityMode.Tcp, you can simply enforce HTTPS tunnel usage by suppressing outbound connectivity on ports 9350-9354 with a local firewall.  

You can enforce HTTPS usage by setting 

        ServiceBusEnvironment.SystemConnectivity.Mode=ConnectivityMode.Https 
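
As a sketch of where that line would live (again assuming the WindowsAzure.ServiceBus client, where ServiceBusEnvironment is available), the setting should be applied once at application startup, before any messaging client or factory is created:

using Microsoft.ServiceBus;

static class TransportConfig
{
    public static void ForceHttpsTunnelling()
    {
        // Route the NetMessaging/WCF transport over HTTPS (port 443) so the TLS
        // version follows the operating system defaults rather than the TLS 1.0
        // hardcoded into the older .NET Framework WCF TCP transport.
        ServiceBusEnvironment.SystemConnectivity.Mode = ConnectivityMode.Https;
    }
}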

Supported Cipher Suites 

A key concern with older implementations of TLS is that they prefer outdated cipher suites. The Service Bus TLS configuration only offers the following options, in order of service-side preference. The last three options are only temporarily retained for compatibility with some of the oldest clients:

  • TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384 (0xc030), ECDH x25519 (eq. 3072 bits RSA) 
  • TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256 (0xc02f), ECDH x25519 (eq. 3072 bits RSA) 
  • TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384 (0xc028), ECDH x25519 (eq. 3072 bits RSA) 
  • TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256 (0xc027), ECDH x25519 (eq. 3072 bits RSA) 
  • TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA (0xc014), ECDH x25519 (eq. 3072 bits RSA) 
  • TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA (0xc013), ECDH x25519 (eq. 3072 bits RSA) 
  • TLS_RSA_WITH_AES_256_GCM_SHA384 (0x9d) 
  • TLS_RSA_WITH_AES_128_GCM_SHA256 (0x9c) 
  • TLS_RSA_WITH_AES_256_CBC_SHA256 (0x3d) 
  • TLS_RSA_WITH_AES_128_CBC_SHA256 (0x3c) 
  • TLS_RSA_WITH_AES_256_CBC_SHA (0x35) 
  • TLS_RSA_WITH_AES_128_CBC_SHA (0x2f) 
  • TLS_RSA_WITH_3DES_EDE_CBC_SHA (0xa) 

 

Summary 

Please verify that your applications are configured to use TLS 1.2. If they are (and this is purely a client choice), your applications will be in compliance with all policies that require TLS 1.2.  

While TLS 1.0 and TLS 1.1 are still endpoint options on Service Bus, you should consider it urgent to move existing applications using out-of-support clients to TLS 1.2, because it is foreseeable that these versions will be retired from use across all Azure cloud services.  

How can I make a dialog box right-to-left at runtime?



A customer had a program that decided at runtime what language to use for its user interface. The customer was having trouble getting the program's dialog boxes to show up with right-to-left layout when necessary.

We are setting the WS_EX_LAYOUTRTL extended style in the dialog box's WM_INITDIALOG message handler, but what we're seeing is that although the style applies successfully, and the dialog itself renders RTL, its child elements don't.

[Mock screenshot: a dialog whose title bar reads "emaN a esoohC" (the mirrored title), while its contents, a "Name:" label, a "John Smith" edit box, and an "OK" button, still run left-to-right.]

The actual screen shot provided by the customer involved Arabic text, but I used English text so you can see which parts are running left-to-right and which parts are right-to-left. Also, because I don't know Arabic.

The title bar is running right-to-left, as desired, but the contents of the dialog are still left-to-right. What's going on?

As I've noted before, the WS_EX_LAYOUTRTL extended style is inherited by child windows (unless blocked by the WS_EX_NOINHERITLAYOUT extended style). What I didn't call out is that this inheritance occurs at the point the child window is created. The child window takes a snapshot of its parent window's layout; future changes to the parent window's layout have no effect.

The next piece of the puzzle is realizing that the WM_INITDIALOG message is sent after the child windows have been created. I called this out explicitly in an earlier discussion of dialog boxes, but you already know this, because your WM_INITDIALOG message handler calls GetDlgItem to obtain handles to dialog child windows in order to initialize and configure them.

So you need to get the WS_EX_LAYOUTRTL onto the window before child windows are created. One way of doing this is to edit the dialog template and add (or remove) the WS_EX_LAYOUTRTL extended style from the dwExStyle member of the 32-bit extended header before you call CreateDialogIndirect or one of its relatives.

Failing to install a new instance of SQL server 2014/2016


Yesterday, I was working with a client of mine on an issue with configuring SQL Server Reporting Services over a Failover Cluster Instance. Due to issues, we had to uninstall the existing instance on the passive node and install it fresh.

Before the new install attempt, we cleaned the registry entries and uninstalled the various SQL components using the command below:

WMIC PRODUCT Where "Caption like '%SQL%'" GET Caption, IdentifyingNumber 

This command lists the product GUIDs, which can then be used to uninstall the components with:

msiexec /X {guidnumber}

After that, I attempted a new installation of the SQL Server instance. It went through all the stages and generated the configuration.ini file as well, but failed at the final stage of installation with the pop-up error message below:

TITLE: Microsoft SQL Server 2016 Setup
------------------------------
The following error has occurred:
Updating permission setting for file 'G:\System Volume Information\WPSettings.dat' failed. The file permission setting were supposed to be set to 'D:P(A;OICI;FA;;;BA)(A;OICI;FA;;;SY)(A;OICI;FA;;;CO)(A;OICI;FA;;;S-1-5-80-3880718306-3832830129-1677859214-2598158968-1052248003)'.
Click 'Retry' to retry the failed action, or click 'Cancel' to cancel this action and continue setup.
For help, click: http://go.microsoft.com/fwlink?LinkID=20476&ProdName=Microsoft%20SQL%20Server&EvtSrc=setup.rll&EvtID=50000&ProdVer=13.0.5201.2&EvtType=0x88792597%25400xBB814387
------------------------------
BUTTONS:
&Retry
Cancel
------------------------------

When we examined the contents of the G: drive, we found it was used as the tempDB drive. We cleared the previous tempDB contents, retried the installation, and things went smoothly: a new instance was installed.

After that, using the scale-out deployment feature (https://docs.microsoft.com/en-us/sql/reporting-services/install-windows/configure-a-native-mode-report-server-scale-out-deployment?view=sql-server-2017), we configured SSRS over the failover cluster instance and resolved the issue.

 

Hope this helps! Happy installing!

Run your CI with Jenkins and CD with Azure DevOps


Azure release pipelines provide you with the first-class experience to integrate with Jenkins. You can have Jenkins as your Continuous Integration (CI) system and use Azure DevOps Release for your Continuous Deployment (CD) and get all the benefits of Azure DevOps like:

  • End to end traceability for your CI/CD workflow.
  • Track your commits and work items.
  • Enable manual approvals and deployment gates on your pipeline.
  • Deploy to various services (Azure) via Azure pipelines.

In this example you will build a Java web app using Jenkins and deploy it to an Azure Linux VM using Azure DevOps Pipelines.

Ensure the repo where your code is hosted (GitHub, GHE, or GitLab) is linked with your Jenkins project. Also, please ensure the JIRA server plugin is installed on Jenkins so that you can track your JIRA issues.

Now you can configure Jenkins with Azure DevOps to run your Continuous Deployment workflows.

  1. Install Team Foundation Server plugin on Jenkins.
  2. Now inside your Jenkins project, you will find a new post-build action “Trigger release in TFS/Team Services”. Add the action into your project.
  3. Provide the collection URL as https://<accountname>.visualstudio.com/DefaultCollection/
  4. Create a credential (username/password) for Azure DevOps with a PAT as the password. Leave the username empty. Pass this credential to the action.
  5. Now you can select the Azure DevOps project and Release definition from the dropdowns. Choose the Release Definition you want to trigger upon completion on this Jenkins job.

Now a new release (CD) gets triggered every time your Jenkins CI job is completed.

 

However, to consume the Jenkins job, you need to define Jenkins as an artifact source in Azure DevOps.

  1. Go to your Azure DevOps project and create a new service connection for Jenkins.
  2. Now, go to your Release pipeline and add a new Jenkins artifact source and select the Jenkins job you want to consume as part of your pipeline.
  3. Every time you run your pipeline, artifacts from your Jenkins job are downloaded automatically and made available for you. All the associated commits and JIRA issues are also extracted so that you can compare release deployments and get full traceability of your workflow.

That’s it and now you have a complete DevOps workflow with Jenkins as CI and Azure DevOps as CD and you can get full traceability of your workflow.

You can take advantage of Azure DevOps release features like approvals and deployment gates, integrate with ServiceNow, deploy to Azure or AWS, deploy to Linux VMs using Ansible, and much more.

Get started with Dynamics 365 Service Scheduling


The new scheduling experience in Customer Service Hub is now live. It is powered by Universal Resource Scheduling and provides an efficient way to schedule resources for service activity. It considers the availability of employees, facilities, and equipment to plan schedules accordingly. It also helps customer service organizations improve service quality by preventing over-scheduling.

If you use Dynamics 365 for Customer Service Enterprise, Customer Engagement Plan, or Dynamics 365 Plan licenses, get started with the new service scheduling by following the steps below:

To enable the functionality in your org,

  • Sign up to enable service scheduling on your org.
  • Some pre-requisites are:
    • The org must be new or reset, on Dynamics 365 for Customer Engagement 9.0.2.1080 or later.
    • The org should have Customer Service Hub, version 9.0.5.56 or later.
    • The org should either have a Universal Resource Scheduling package version higher than 3.2.0.405, or have no Universal Resource Scheduling package installed.

Note: Once the new service scheduling is installed, your access to legacy service scheduling will be cut off in the current organization. We recommend that you first opt in your sandbox instance and test all your use cases, and then move to your production instance.

Note: The new service scheduling only works as a Unified Client application. To access the new functionality, go to Customer Service Hub, expand the sitemap and then navigate to Scheduling.

Experience the new Service Scheduling

The new service scheduling functionality brings in the URS entities (such as Resources, Resource Categories, Fulfillment Preferences, and Organizational Units) and retains legacy service scheduling entities such as Service, Service Activity, and Facilities/Equipment.

Set up resources

Set up resources of type Facilities or Equipment so that they can be consumed within scheduling. Once you set up a resource record, you can define its capacity and work hours, which helps in scheduling a service.

 

Set up a resource

 

Monthly calendar

 

Set up resource categories

Set up resource categories to group resources of various kinds into categories.

This approach allows admins to define service criteria flexibly without having to hardcode a resource in the criteria.

 

Set up resource categories

 

In the example below, while scheduling a Service Activity of the type ‘Ad-hoc bike service’, the schedule board would show the option to include one repair center and one workbench.

 

Resource Requirements tab on Service record

 

Set up services

Set up services to define the resource criteria that need to be applied for scheduling a service activity.

Create and schedule a service activity

Create a service activity by choosing a predefined service. As you select ‘Book’, the schedule board is launched to show the list of resource options that are applicable for the given requirement.

 

Create and schedule a service activity

 

The schedule board lets you use the ‘Schedule Assistant Filter’ to narrow down results to suit a specific customer need, based on time slot, organization unit, required resource, and so on.

 

Schedule Assistant filter

 

Learn more about Scheduling

 

We are eager to see how the new scheduling experience helps you transform the way you schedule, and we look forward to your adoption and feedback!

 

Thanks

Vamsi Korlepara, Shubham Jain

Dynamics 365 for Customer Service

 

Experiencing Data Access issue in Azure Portal for Many Data Types – 11/22 – Investigating


Final Update: Thursday, 22 November 2018 16:59 UTC

We've confirmed that all systems are back to normal with no customer impact as of 11/22, 15:50 UTC. Our logs show the incident started on 11/22, 15:10 UTC, and that during the 40 minutes that it took to resolve the issue, approximately 7% of customers in West Europe might have experienced data access issues in the Azure Portal for Application Insights.
  • Root Cause: The failure was due to issues with one of the dependent services.
  • Incident Timeline: 40 minutes - 11/22, 15:10 UTC through 11/22, 15:50 UTC

We understand that customers rely on Application Insights as a critical service and apologize for any impact this incident caused.

-Venkat




Initial Update: Thursday, 22 November 2018 15:59 UTC

We are aware of issues within Application Insights and are actively investigating. Some customers may experience data access issues.
  • Work Around: none
  • Next Update: Before 11/22 18:00 UTC

We are working hard to resolve this issue and apologize for any inconvenience.
-Anmol





Happy Thanksgiving wishes and thanks to you all


As we celebrate another Thanksgiving Day here in the U.S., I wanted to take a quick moment to once again express my thanks and gratitude to all of the wonderful people I have the honor and blessing to get to interact, associate, and work with each and every day in my role at Microsoft. From the partners stretching across all corners of the world, to my counterparts located right here at our headquarters in Redmond/Bellevue and across the US.

 

While my current role as Director of the US Business Intelligence and Analytics team here at Microsoft has me focused heavily on supporting our internal leadership and teams versus several of my former roles, which were highly externally focused, such as leading our US SMB Community, leading our US Partner Community engagement, leading our Worldwide Partner satisfaction and community efforts, etc., I still enjoy the opportunities I get to interact with our partners from around the world whenever I can, as well as the engagement with my counterparts here at Microsoft from around the world.

 

Thank you to you all for the partnership, collaboration, engagement, and opportunities to work with you all throughout these years. Here’s wishing you all a very Happy Thanksgiving day with your friends, family, and loved ones, wherever you are.

 

Best wishes to you all,

 

Eric Ligman

Director, Business Intelligence & Analytics

Microsoft Corporation

This posting is provided "AS IS" with no warranties, and confers no rights

The default number of threads in an I/O completion port is the number of processors, but is that logical or physical processors?



The CreateIoCompletionPort function lets you specify how many concurrent threads can be processing work from the completion port. But if you pass a value of 0 for NumberOfConcurrentThreads, "the system allows as many concurrently running threads as there are processors in the system."

Are these physical processors or logical processors?

They are logical processors.



One way to figure this out is that the goal of the I/O completion port is to keep CPU usage at exactly 100%. If the I/O completion port consumed only as many threads as physical processors, then you wouldn't quite get to 100% CPU utilization, because there would be extra capacity on the unused logical processors.

Another way to figure this out is to use your understanding of history. I/O completion ports were created long before hyperthreading was invented, so this code treated all logical processors as full-fledged processors.

And a third way to figure it out is to test it.



#include <windows.h>
#include <strsafe.h>

#define THREADS 10

LONG ActiveThreads = 0;

DWORD CALLBACK IoThread(void* Port)
{
    DWORD Bytes;
    ULONG_PTR Key;
    OVERLAPPED* Overlapped;
    while (GetQueuedCompletionStatus(Port, &Bytes,
                                     &Key, &Overlapped, 1000)) {
        TCHAR msg[64];
        // Count this thread as actively processing work.
        auto count = InterlockedIncrement(&ActiveThreads);
        StringCchPrintf(msg, ARRAYSIZE(msg), TEXT(">%d\r\n"), count);
        OutputDebugString(msg);

        // Busy-wait for one second so the thread counts as running work.
        DWORD Tick = GetTickCount();
        while (GetTickCount() - Tick < 1000) { }

        count = InterlockedDecrement(&ActiveThreads);
        StringCchPrintf(msg, ARRAYSIZE(msg), TEXT("<%d\r\n"), count);
        OutputDebugString(msg);
    }
    return 0;
}

int __cdecl main(int, char**)
{
    HANDLE Port = CreateIoCompletionPort(INVALID_HANDLE_VALUE,
                                         nullptr, 0, 0);

    HANDLE ThreadHandles[THREADS];
    int i;
    for (i = 0; i < THREADS; i++) {
        DWORD Id;
        ThreadHandles[i] = CreateThread(0, 0, IoThread, Port, 0, &Id);
    }

    for (i = 0; i < THREADS * 2; i++) {
        PostQueuedCompletionStatus(Port, 0, 0, nullptr);
    }

    for (i = 0; i < THREADS; i++) {
        WaitForSingleObject(ThreadHandles[i], INFINITE);
    }

    return 0;
}



Pick a value for THREADS that is greater than the
number of logical processors.



We start by creating an I/O completion port and a bunch of threads whose job it is to complete work posted to the port. We then post a lot of work to the port and wait for the threads to drain the work.

Each thread grabs a work item, then increments a counter that lets us know how many threads are actively processing work. The thread then goes into a tight spin loop for one second. It has to do this rather than Sleep because the thread needs to be actively doing work for it to be counted against the I/O completion port's concurrency limit.

After wasting some time, the thread decrements the count of active threads, and then goes back to looking for more work.

Along the way, we print the number of active threads.

Run this program, and you'll see that it retires work in chunks, and the number of threads in each chunk is the number of logical processors.



So there, confirmed by experimentation.

If activation of your gift card lasts more than four hours, consult a physician immediately



I received a gift card. The instructions which accompany the gift card include the following:

To activate, sign the back of the card. In some cases, activation may take up to four hours.

What, you think I write that slowly?

Bonus chatter: Yes, I know that the reason is that it may take up to four hours for the purchase to propagate through the system, and if you sign the card immediately after purchasing it, you may still need to wait up to four hours. That didn't apply in my case because I received the gift card in the mail.

MVPs Discuss OBS Studio, Power BI, and More on The Friday Five!


How to use OBS Studio as an external encoder for Microsoft Teams and Stream Live Events

Luca Vitali is from Bologna, Italy. He has worked in IT since 1996, starting with PC management, then moving into AD and Exchange. Since 2008 he has focused on Office Communications Server, then Lync, and now Skype for Business and Microsoft Teams. Luca began to see the future of PBX replacement in Microsoft UC. In 2016 Luca became a Microsoft MVP for Office Apps & Services. He is a member of the Unified Communications Users Group Italy, regularly writes technical articles on MS Enterprise Voice, and participates as a speaker in conferences in Italy. Follow him on Twitter @Luca_Vitali

MVP Mondays - Getting to know MVPs with Mark Smith 

Mark Smith is a Microsoft Business Applications MVP specializing in Dynamics 365 and the Power Platform. Mark currently lives in London; he spent four years working in Australia and then travelled the world with his wife for 18 months before moving to London. He originated from New Zealand, hence the nz in nz365guy. Mark is on a mission to help people reach their full potential using technology. He is always open to connecting with people in the community and helping them any way he can. Follow him on Twitter @nz365guy

Convert from Seconds to Minutes with a DAX Measure

Gilbert Quevauvilliers has been working in the Power BI space since its inception. He is currently a consultant for FourMoo where he provides Power BI solutions to large and small organizations around the world. Gilbert currently lives on the Gold Coast in Australia. Follow him on Twitter @GilbertQue

Power BI is a model-based tool

Marco Russo is a business intelligence consultant and mentor. He is a founder of SQLBI and has written several books about business intelligence. Today, Marco focuses his time on training and consulting on DAX and data modeling for Power BI and Analysis Services. Marco also teaches public classes worldwide, and is a regular speaker at international conferences like Microsoft Ignite, PASS Summit, and SQLBits. Follow him on Twitter @marcorus.

Windows Embedded Compact is NOR an RTOS!

Michel Verhagen has been a Windows Embedded Compact (CE) consultant since 2000, specializing in building complex Windows Embedded Compact (CE) platforms and device drivers for industrial appliances for customers around the world. He was born in the Netherlands but since January 2006 he has been living in New Zealand where he started the company GuruCE.  Michel is very active in the public Microsoft forums dealing with Windows Embedded related questions from the community. In 2001 he was awarded MVP in Windows Development. 

Download the C# 5.0 Language Specification


Get your copy of the C# 5.0 Language Specification here.

OneNote 2016 is missing after an update to Office 365 Pro Plus


A few days ago an update to my Office 365 Pro Plus removed OneNote 2016 from my company laptop. If you want it back you can bring it back using the installer available in the article "OneNote 2016 is missing after installing Office 2019 or Office 365". Notice however that OneNote 2016 is no longer being updated with new features as stated in the article "What's the difference between OneNote and OneNote 2016?".

Reasons for installing OneNote 2016 are, for example, having local-only notebooks (OneNote for Windows 10 requires notebooks to be in the cloud) or being on older client operating systems such as Windows 7 or Windows 8.

Top Stories from the Microsoft DevOps Community – 2018.11.23


I hope that you've recovered from your Thanksgiving feast and you're sitting around comfortably, binge-watching sitcoms and reading the latest news in DevOps. No? Is that just me? Well, now it can be you, too, because here's what I've been reading today.

Create an Azure DevOps Services Self-Hosted Agent in Azure Using Terraform, Cloud-init—and Azure DevOps Pipelines!
The more I play with Terraform the more excited I get about it. And Graham Smith has written up a great walkthrough introducing Terraform, cloud-init and Azure Pipelines to help get me up to speed.

Tutorial: Terraforming your JAMstack on Azure with Gatsby, Azure Pipelines, and Git
I'm a big devotee of JAMstack - writing markdown and then using Azure Pipelines to render that into a simple and cheap static website. Elena Neroslavskaya has a great post introducing - you guessed it: Terraform - to the mix.

Work with remote in Git to share your code
DevOps isn't just about the tools - it's about the culture and the processes, too. But it's hard to imagine a modern workflow without Git, so this tutorial on sharing code by Mohit Goyal is incredibly important.

Beyond "Lift-and-Shift": Application Modernization with Microsoft Azure
I meet a lot of people who are starting their cloud migration journey by "lifting and shifting" their existing on-premises workloads into virtual machines. That's a great start, but it's just that - a start. Stephen Abdo looks at some of the reasons to modernize your applications to take real advantage of the cloud.

Events
There are some great DevOps-focused events coming up - Red Gate's SQL in the City Streamed event will bring you up to speed on compliance in Database DevOps on December 12. And if you're in the healthcare development and IT sector, don't miss the webinar Make the Most of Azure DevOps in Healthcare.

If you're in the UK, don't miss Real World DevOps, taking place next week in Manchester and London. And don’t forget the amazing Azure DevOps launch events going on around the world from now through March; there might be one near you!

Did you write an article about Azure DevOps or find some great content about DevOps on Azure? Let me know! I’m @ethomson on Twitter.


The future of .NET, WPF, Visual Studio, … and lots of news


Come and chat with us at the Microsoft office in Prague about what's new for developers in .NET, Visual Studio, artificial intelligence, and the cloud. We'll have topics for traditional desktop developers as well as for those of you who program for the cloud.

The informal evening Connect Meetup 2018 Praha is being held on the occasion of the Microsoft Connect conference, where we traditionally announce plenty of news and introduce lots of new technologies.

  • You'll have the chance to chat informally, on site, with experts on a range of topics, including developers from the teams that build .NET at Microsoft.
  • We'll stream the conference live with the announcements and demos; if you don't feel like asking questions, you can just listen or watch the news online.
  • You can bring your own laptops to try out the new features or to consult on problems you're currently facing.

We'll meet in Prague, in the Microsoft s.r.o. building, on 4 December 2018, starting at 5:00 PM.
Admission is free, but online registration in advance is required!
Come at 5:00 PM; the program itself starts at 5:30 PM and ends no later than 10:00 PM.
Evening refreshments will be provided.

We look forward to seeing you!

Azure Data Architecture Guide – Blog #5: Clickstream analysis


We'll continue to delve into the Azure Data Architecture Guide with our fifth blog entry in this series. The previous entries for this blog series are:

Like the previous post, we'll work from a technology implementation seen directly in our customer engagements. The example can help lead you to the ADAG content to make the right technology choices for your business.

Clickstream analysis

Engage with your customers and uncover insights from data generated by clickstream logs in real time, using Azure. Rapidly ingest incoming data through Event Hubs (or leverage Apache Kafka), process it with Spark Streaming and Spark ML for predicting product recommendations, then use the Spark to Azure Cosmos DB connector to save the processed data to Cosmos DB for global distribution to your customers, wherever they are.

Clickstream analysis

As an alternative to Apache Spark on HDInsight (Spark ML), you can use Azure Databricks, a fast, easy, and collaborative Apache Spark-based analytics service.

Highlighted services

Related ADAG articles

 

Please peruse ADAG to find a clear path for you to architect your data solution on Azure:

 

Azure CAT Guidance

"Hands-on solutions, with our heads in the Cloud!"

SQL Server 2017 on Linux: Support for third-party AD providers


In the past, SQL Server on Linux has required SSSD to be configured for AD Authentication to work. SSSD is a very powerful service which provides support for AD authentication and integration. However, there are some third-party AD providers (such as Centrify, PBIS, and VAS) which provide additional value-added services (such as auditing) on top of basic AD authentication. These third-party AD providers do not configure SSSD when joining the AD domain.

 

There are many customers who depend on these third-party AD providers to provide a single “trusted zone” for all users to access resources regardless of the host OS of those resources. In today’s hybrid environments, which use Windows servers, Linux servers, and Windows AD to provide authentication, ease of configuration and audit capability, among other things, are often highly preferred. Listening to the demand from such customers, SQL Server now supports AD authentication without SSSD.

 

To support these third-party AD providers which do not configure SSSD, SQL Server performs its own LDAP queries when looking up AD accounts. For this to work successfully, several extra configuration checks must be done; these can be found here: https://docs.microsoft.com/en-us/sql/linux/sql-server-linux-active-directory-third-party-providers?view=sql-server-2017. This functionality was first made available in SQL Server 2017 CU10, but it is highly recommended to use it with SQL Server 2017 CU12 onwards.

 

Additionally, if your domain controller supports LDAPS, you can force all connections from SQL Server to the domain controllers to be over LDAPS. To check that your client can contact the domain controller over LDAPS, run the following bash command: “ldapsearch -H ldaps://contoso.com:3269”. To set SQL Server to only use LDAPS, run the following:

  • sudo mssql-conf set network.disablesssd true
  • sudo mssql-conf set network.forceldaps true
  • systemctl restart mssql-server

Tejas Shah - Senior Program Manager

Dylan Gray - Senior Software Engineer

PowerShell Constrained Language mode and the Dot-Source Operator



PowerShell works with application control systems, such as AppLocker and Windows Defender Application Control (WDAC), by automatically running in ConstrainedLanguage mode. ConstrainedLanguage mode restricts some exploitable aspects of PowerShell while still giving you a rich shell to run commands and scripts in. This is different from the usual application whitelisting rules, where an application is either allowed to run or not.

But there are times when the full power of PowerShell is needed, so we allow script files to run in FullLanguage mode when they are trusted by the policy. Trust can be indicated through file signing or other policy mechanisms such as file hash. However, script typed into the interactive shell is always run constrained.

Since PowerShell can run script in both Full and Constrained language modes, we need to protect the boundary between them. We don't want to leak variables or functions between sessions running in different language modes.

The PowerShell dot-source operator brings script files into the current session scope. It is a way to reuse script. All script functions and variables defined in the script file become part of the script it is dot sourced into. It is like copying and pasting text from the script file directly into your script.

# HelperFn1, HelperFn2 are defined in HelperFunctions.ps1
# Dot-source the file here to get access to them (no need to copy/paste)
. c:\Scripts\HelperFunctions.ps1
HelperFn1
HelperFn2

This presents a problem when language modes are in effect with system application control. If an untrusted script is dot-sourced into a script with full trust then it has access to all those functions that run in FullLanguage mode, which can result in application control bypass through arbitrary code execution or privilege escalation. Consequently, PowerShell prevents this by throwing an error when dot-sourcing is attempted across language modes.

Example 1:

System is in WDAC policy lock down. To start with, neither script is trusted and so both run in ConstrainedLanguage mode. But the HelperFn1 function uses method invocation which isn't allowed in that mode.

PS> type c:\MyScript.ps1
Write-Output "Dot sourcing MyHelper.ps1 script file"
. c:\MyHelper.ps1
HelperFn1
PS> type c:\MyHelper.ps1
function HelperFn1
{
    "Language mode: $($ExecutionContext.SessionState.LanguageMode)"
    [System.Console]::WriteLine("This can only run in FullLanguage mode!")
}
PS> c:\MyScript.ps1
Dot sourcing MyHelper.ps1 script file
Language mode: ConstrainedLanguage
Cannot invoke method. Method invocation is supported only on core types in this language mode.
At C:\MyHelper.ps1:4 char:5
+     [System.Console]::WriteLine("This cannot run in ConstrainedLangua ...
+     ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : InvalidOperation: (:) [], RuntimeException
    + FullyQualifiedErrorId : MethodInvocationNotSupportedInConstrainedLanguage

Both scripts are untrusted and run in ConstrainedLanguage mode, so dot-sourcing the MyHelper.ps1 file works. However, the HelperFn1 function performs method invocation that is not allowed in ConstrainedLanguage and fails when run. MyHelper.ps1 needs to be signed as trusted so it can run at FullLanguage.

Next we have mixed language modes. MyHelper.ps1 is signed and trusted, but MyScript.ps1 is not.

PS> c:\MyScript.ps1
Dot sourcing MyHelper.ps1 script file
C:\MyHelper.ps1 : Cannot dot-source this command because it was defined in a different language mode. To invoke this command without importing its contents, omit the '.' operator.
At C:\MyScript.ps1:2 char:1
+ . 'c:\MyHelper.ps1'
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : InvalidOperation: (:) [MyHelper.ps1], NotSupportedException
    + FullyQualifiedErrorId : DotSourceNotSupported,MyHelper.ps1
...

And we get a dot-source error because we are trying to dot-source script that has a different language mode than the session it is being dot-sourced into.

Finally, we sign as trusted both script files and everything works.

PS> c:\MyScript.ps1
Dot sourcing MyHelper.ps1 script file
Language mode: FullLanguage
This can only run in FullLanguage mode!

The lesson here is to ensure all script components run in the same language mode on policy locked down systems. If one component must run in FullLanguage mode, then all components should run in FullLanguage mode. This means validating that each component is safe to run in FullLanguage and indicating they are trusted to the application control policy.

So this solves all language mode problems, right? If FullLanguage is not needed then just ensure all script components run untrusted, which is the default condition. If they require FullLanguage then carefully validate all components and mark them as trusted. Unfortunately, there is one case where this best practice doesn't work.

PowerShell Profile File

The PowerShell profile file (profile.ps1) is loaded and run at PowerShell start up. If that script requires FullLanguage mode on policy lock down systems, you just validate and sign the file as trusted, right?

Example 2:

PS> type c:\users\<user>\Documents\WindowsPowerShell\profile.ps1
Write-Output "Running Profile"
[System.Console]::WriteLine("This can only run in FullLanguage!")
# Sign file so it is trusted and will run in FullLanguage mode
PS> Set-AuthenticodeSignature -FilePath .\Profile.ps1 -Certificate $myPolicyCert
# Start a new PowerShell session and run the profile script
PS> powershell.exe
Windows PowerShell
Copyright (C) Microsoft Corporation. All rights reserved.
C:\Users\<user>\Documents\WindowsPowerShell\profile.ps1 : Cannot dot-source this command because it was defined in a different language mode. To invoke this command without importing its contents, omit the '.' operator.
At line:1 char:1
+ . 'C:\Users\<user>\Documents\WindowsPowerShell\profile.ps1'
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : InvalidOperation: (:) [profile.ps1], NotSupportedException
    + FullyQualifiedErrorId : DotSourceNotSupported,profile.ps1

What gives? The profile.ps1 file was signed and is policy trusted. Why the error?
Well, the issue is that PowerShell dot-sources the profile.ps1 file into the default PowerShell session, which must run in ConstrainedLanguage because of the policy. So we are attempting to dot-source a FullLanguage script into a ConstrainedLanguage session, and that is not allowed. This is a catch 22 because if the profile.ps1 is not signed, it may not run if it needs FullLanguage privileges (e.g., invoke methods). But if you sign it, it still won't run because of how it is dot-sourced into the current ConstrainedLanguage interactive session.

Unfortunately, the only solution is to keep the profile.ps1 file fairly simple so that it does not need FullLanguage, and refrain from making it trusted. Keep in mind that this is only an issue when running with application control policy. Otherwise, language modes do not come into play and PowerShell profile files run normally.

Paul Higinbotham
Senior Software Engineer
PowerShell Team

Create Azure DevOps CI/CD pipeline using Azure DevTest Lab without access to environment ARM templates.


The ability to use Azure DevTest Labs within a development inner loop and testing has been documented, but this post will look at how DevTest Labs can be used in the Azure DevOps build and release pipelines. The basic flow is to have a build pipeline that will build the application code, create the necessary environment in DevTest Labs, deploy the application to the DevTest Labs environment, and then test the code. Once the build has completed successfully, the release pipeline will use the build artifacts to deploy staging or production.

One of the necessary premises is that all the information needed to recreate the “tested” ecosystem is available within the build artifacts, including the configuration of the Azure resources. As Azure resources incur a cost when used, companies tend to want to either control or track the use of these resources. In some situations, the Azure RM templates that are used to create and configure the resources may be managed by another department such as IT, which could store them in a different repository. This leads to an interesting situation where a build will be created and tested, and both the code and the configuration will need to be stored within the build artifacts to properly recreate the system in production. Using DevTest Labs during the build/test phase, we can add the correct ARM templates and supporting files to the build sources, so that during the release phase the exact configuration used in testing is deployed to production. The “Create Azure DevTest Labs Environment” task with the proper configuration will save the ARM templates within the build artifacts. For this example I’ll be using the code from the Tutorial: Build a .NET Core and SQL Database web app in Azure App Service, to deploy and test the web app in Azure.

Code and Configuration in separate repositories

Overall Flow

Setup Azure Resource

There are a couple of items that will need to be created beforehand:

  • Two repositories, the first with the code from the tutorial, the second will contain the ARM template (configuration).
  • A Resource Group for deployment of the production code and configuration.
  • A DevTest Lab (TestLab) will need to be set up with a connection to the configuration repository for the build pipeline. I’ve included the necessary ARM template that will create the Web App and SQL Server to support the Tutorial: Build a .NET Core and SQL Database web app in Azure App Service. The ARM template will need to be checked into the configuration repository as azuredeploy.json, along with the metadata.json, to allow DevTest Labs to recognize and deploy the template.

The DevTest Lab is where the build pipeline will create the environment and deploy the code for testing.

Setup Build pipeline

In Azure DevOps, create a new build pipeline using the code from the Tutorial: Build a .NET Core and SQL Database web app in Azure App Service using the “ASP.NET Core” template which will populate the necessary task to build, test, and publish the code.

Two additional tasks will need to be added to create the environment in DevTest Lab and deploy to the environment.

Add the “Create Azure DevTest Labs Environment” task before the “Test” task. In the “create environment” task, use the pulldowns to select the appropriate Azure RM Subscription, Lab Name, Repository Name, and Template Name (which shows the folder name where the environment is stored). I would highly recommend using the pulldowns; if you manually enter the information, you will need the fully qualified Azure resource ID for this task to work. The task displays the “friendly” names instead of the resource IDs. The environment name is the display name shown within DevTest Labs; this should be a unique name for each build, e.g. “TestEnv$(Build.BuildId)”. Either the Parameters File or the Parameters section can be used to pass information into the ARM template (see Additional information / Azure Resource Management Parameters for an example). Enable “Create output variables based on the environment template output?” to allow the output to be recognized by the build pipeline. Enable “Create artifact based on the environment template output?” with the appropriate output names to allow the ARM templates to be downloaded to the build artifact. This can be done through the advanced options.

 

The second task is the “Azure App Service Deploy”, which will be added after the Create task above. The App type will be “Web App” and the App Service name set to $(WebSite) to deploy the app to the app service within the DTL Environment that was created.

Setup Release pipeline

In the release pipeline, the two tasks are the “Azure Deployment: Create Or Update Resource Group action” and “Deploy Azure App Service”. The Create or Update Resource Group action will need:

  • The Azure Subscription where the production resource group is located.
  • The action will be “Create or update resource group”.
  • The name of the Resource Group.
  • The location of the Resource Group.
  • The template location is a “linked artifact”.
  • The template is “$(System.DefaultWorkingDirectory)/_<Build Name>/drop/DeployedEnvironment/azuredeploy.json”.
  • The Override template parameters for the ARM template – see Additional information / Azure Resource Management Parameters for an example.

The rest of the options can be left with the defaults. If the ARM template includes linked templates, then a custom resource group deployment will need to be implemented. The second task, “Deploy Azure App Service”, will need the Azure Subscription; the App type will be Web App; and the App Service name is the one we’ve set up as $(WebSite). The rest can be left at the defaults.

Test Run

Now that both pipelines are set up, manually queue up a build and see it work. The next step is to set the appropriate trigger for the build and connect the build to the release pipeline.

Have a question? Check out the answers or ask a new one at MSDN forum.

Roger Best, Senior Software Engineer

Roger is part of the Visual Studio and .NET engineering team focused on Visual Studio and Azure customers. He has been at Microsoft for over 20 years, focusing on developer technologies for the past decade or so. In his spare time, he watches too many movies and tries to survive triathlons.

 

Additional information

Demo Build / Release variables

  • AdministratorLogin: Administrator Name
  • AdministratorPassword: Administrator Password – secret type
  • SqlDbName: SQL database name – lower case only
  • SqlSrvName: SQL server name
  • WebSite: App Service name

Azure Resource Management Parameters

-hostingPlanName 'hostplan$(Build.BuildId)' -webSiteName '$(WebSite)' -sqlServerName '$(SqlSrvName)' -administratorLogin '$(AdministratorLogin)' -administratorLoginPassword '$(AdministratorPassword)' -databaseName '$(SqlDbName)'

DevTest Lab Environment metadata information (metadata.json)

{
"itemDisplayName": "NET Core application with SQL Db",
"description": "This template creates an Azure Web App with SQL DB."
}

Azure ARM Template for Web App with SQL Server (azuredeploy.json)

{
"$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
"contentVersion": "1.0.0.0",
"parameters": {
"hostingPlanName": {
"type": "string",
"minLength": 1
},
"webSiteName": {
"type": "string",
"defaultValue": "testwebapp"
},
"sqlServerName": {
"type": "string",
"defaultValue": "testsqlsrv"
},
"skuName": {
"type": "string",
"defaultValue": "F1",
"allowedValues": [
"F1",
"D1",
"B1",
"B2",
"B3",
"S1",
"S2",
"S3",
"P1",
"P2",
"P3",
"P4"
],
"metadata": {
"description": "Describes plan's pricing tier and instance size. Check details at https://azure.microsoft.com/en-us/pricing/details/app-service/"
}
},
"skuCapacity": {
"type": "int",
"defaultValue": 1,
"minValue": 1,
"metadata": {
"description": "Describes plan's instance count"
}
},
"administratorLogin": {
"type": "string"
},
"administratorLoginPassword": {
"type": "securestring"
},
"databaseName": {
"type": "string"
},
"collation": {
"type": "string",
"defaultValue": "SQL_Latin1_General_CP1_CI_AS"
},
"edition": {
"type": "string",
"defaultValue": "Basic",
"allowedValues": [
"Basic",
"Standard",
"Premium"
]
},
"maxSizeBytes": {
"type": "string",
"defaultValue": "1073741824"
},
"requestedServiceObjectiveName": {
"type": "string",
"defaultValue": "Basic",
"allowedValues": [
"Basic",
"S0",
"S1",
"S2",
"P1",
"P2",
"P3"
],
"metadata": {
"description": "Describes the performance level for Edition"
}
},
"_artifactsLocation": {
"type": "string",
"defaultValue": ""
},
"_artifactsLocationSasToken": {
"type": "securestring",
"defaultValue": ""
}
},
"variables": {
},
"resources": [
{
"name": "[parameters('sqlserverName')]",
"type": "Microsoft.Sql/servers",
"location": "[resourceGroup().location]",
"tags": {
"displayName": "SqlServer"
},
"apiVersion": "2014-04-01-preview",
"properties": {
"administratorLogin": "[parameters('administratorLogin')]",
"administratorLoginPassword": "[parameters('administratorLoginPassword')]"
},
"resources": [
{
"name": "[parameters('databaseName')]",
"type": "databases",
"location": "[resourceGroup().location]",
"tags": {
"displayName": "Database"
},
"apiVersion": "2014-04-01-preview",
"dependsOn": [
"[resourceId('Microsoft.Sql/servers/', parameters('sqlserverName'))]"
],
"properties": {
"edition": "[parameters('edition')]",
"collation": "[parameters('collation')]",
"maxSizeBytes": "[parameters('maxSizeBytes')]",
"requestedServiceObjectiveName": "[parameters('requestedServiceObjectiveName')]"
}
},
{
"type": "firewallrules",
"apiVersion": "2014-04-01-preview",
"dependsOn": [
"[resourceId('Microsoft.Sql/servers/', parameters('sqlserverName'))]"
],
"location": "[resourceGroup().location]",
"name": "AllowAllWindowsAzureIps",
"properties": {
"endIpAddress": "0.0.0.0",
"startIpAddress": "0.0.0.0"
}
}
]
},
{
"apiVersion": "2015-08-01",
"name": "[parameters('hostingPlanName')]",
"type": "Microsoft.Web/serverfarms",
"location": "[resourceGroup().location]",
"tags": {
"displayName": "HostingPlan"
},
"sku": {
"name": "[parameters('skuName')]",
"capacity": "[parameters('skuCapacity')]"
},
"properties": {
"name": "[parameters('hostingPlanName')]"
}
},
{
"apiVersion": "2015-08-01",
"name": "[parameters('webSiteName')]",
"type": "Microsoft.Web/sites",
"location": "[resourceGroup().location]",
"dependsOn": [
"[resourceId('Microsoft.Web/serverFarms/', parameters('hostingPlanName'))]"
],
"tags": {
"[concat('hidden-related:', resourceGroup().id, '/providers/Microsoft.Web/serverfarms/', parameters('hostingPlanName'))]": "empty",
"displayName": "Website"
},
"properties": {
"name": "[parameters('webSiteName')]",
"serverFarmId": "[resourceId('Microsoft.Web/serverfarms', parameters('hostingPlanName'))]"
},
"resources": [
{
"apiVersion": "2015-08-01",
"type": "config",
"name": "connectionstrings",
"dependsOn": [
"[resourceId('Microsoft.Web/Sites/', parameters('webSiteName'))]"
],
"properties": {
"DefaultConnection": {
"value": "[concat('Data Source=tcp:', reference(resourceId('Microsoft.Sql/servers/', parameters('sqlserverName'))).fullyQualifiedDomainName, ',1433;Initial Catalog=', parameters('databaseName'), ';User Id=', parameters('administratorLogin'), '@', parameters('sqlserverName'), ';Password=', parameters('administratorLoginPassword'), ';')]",
"type": "SQLServer"
}
}
}
]
},
{
"apiVersion": "2014-04-01",
"name": "[concat(parameters('hostingPlanName'), '-', resourceGroup().name)]",
"type": "Microsoft.Insights/autoscalesettings",
"location": "[resourceGroup().location]",
"tags": {
"[concat('hidden-link:', resourceGroup().id, '/providers/Microsoft.Web/serverfarms/', parameters('hostingPlanName'))]": "Resource",
"displayName": "AutoScaleSettings"
},
"dependsOn": [
"[resourceId('Microsoft.Web/serverfarms/', parameters('hostingPlanName'))]"
],
"properties": {
"profiles": [
{
"name": "Default",
"capacity": {
"minimum": 1,
"maximum": 2,
"default": 1
},
"rules": [
{
"metricTrigger": {
"metricName": "CpuPercentage",
"metricResourceUri": "[concat(resourceGroup().id, '/providers/Microsoft.Web/serverfarms/', parameters('hostingPlanName'))]",
"timeGrain": "PT1M",
"statistic": "Average",
"timeWindow": "PT10M",
"timeAggregation": "Average",
"operator": "GreaterThan",
"threshold": 80.0
},
"scaleAction": {
"direction": "Increase",
"type": "ChangeCount",
"value": 1,
"cooldown": "PT10M"
}
},
{
"metricTrigger": {
"metricName": "CpuPercentage",
"metricResourceUri": "[concat(resourceGroup().id, '/providers/Microsoft.Web/serverfarms/', parameters('hostingPlanName'))]",
"timeGrain": "PT1M",
"statistic": "Average",
"timeWindow": "PT1H",
"timeAggregation": "Average",
"operator": "LessThan",
"threshold": 60.0
},
"scaleAction": {
"direction": "Decrease",
"type": "ChangeCount",
"value": 1,
"cooldown": "PT1H"
}
}
]
}
],
"enabled": false,
"name": "[concat(parameters('hostingPlanName'), '-', resourceGroup().name)]",
"targetResourceUri": "[concat(resourceGroup().id, '/providers/Microsoft.Web/serverfarms/', parameters('hostingPlanName'))]"
}
},
{
"apiVersion": "2014-04-01",
"name": "[concat('ServerErrors ', parameters('webSiteName'))]",
"type": "Microsoft.Insights/alertrules",
"location": "[resourceGroup().location]",
"dependsOn": [
"[resourceId('Microsoft.Web/sites/', parameters('webSiteName'))]"
],
"tags": {
"[concat('hidden-link:', resourceGroup().id, '/providers/Microsoft.Web/sites/', parameters('webSiteName'))]": "Resource",
"displayName": "ServerErrorsAlertRule"
},
"properties": {
"name": "[concat('ServerErrors ', parameters('webSiteName'))]",
"description": "[concat(parameters('webSiteName'), ' has some server errors, status code 5xx.')]",
"isEnabled": false,
"condition": {
"odata.type": "Microsoft.Azure.Management.Insights.Models.ThresholdRuleCondition",
"dataSource": {
"odata.type": "Microsoft.Azure.Management.Insights.Models.RuleMetricDataSource",
"resourceUri": "[concat(resourceGroup().id, '/providers/Microsoft.Web/sites/', parameters('webSiteName'))]",
"metricName": "Http5xx"
},
"operator": "GreaterThan",
"threshold": 0.0,
"windowSize": "PT5M"
},
"action": {
"odata.type": "Microsoft.Azure.Management.Insights.Models.RuleEmailAction",
"sendToServiceOwners": true,
"customEmails": []
}
}
},
{
"apiVersion": "2014-04-01",
"name": "[concat('ForbiddenRequests ', parameters('webSiteName'))]",
"type": "Microsoft.Insights/alertrules",
"location": "[resourceGroup().location]",
"dependsOn": [
"[resourceId('Microsoft.Web/sites/', parameters('webSiteName'))]"
],
"tags": {
"[concat('hidden-link:', resourceGroup().id, '/providers/Microsoft.Web/sites/', parameters('webSiteName'))]": "Resource",
"displayName": "ForbiddenRequestsAlertRule"
},
"properties": {
"name": "[concat('ForbiddenRequests ', parameters('webSiteName'))]",
"description": "[concat(parameters('webSiteName'), ' has some requests that are forbidden, status code 403.')]",
"isEnabled": false,
"condition": {
"odata.type": "Microsoft.Azure.Management.Insights.Models.ThresholdRuleCondition",
"dataSource": {
"odata.type": "Microsoft.Azure.Management.Insights.Models.RuleMetricDataSource",
"resourceUri": "[concat(resourceGroup().id, '/providers/Microsoft.Web/sites/', parameters('webSiteName'))]",
"metricName": "Http403"
},
"operator": "GreaterThan",
"threshold": 0,
"windowSize": "PT5M"
},
"action": {
"odata.type": "Microsoft.Azure.Management.Insights.Models.RuleEmailAction",
"sendToServiceOwners": true,
"customEmails": []
}
}
},
{
"apiVersion": "2014-04-01",
"name": "[concat('CPUHigh ', parameters('hostingPlanName'))]",
"type": "Microsoft.Insights/alertrules",
"location": "[resourceGroup().location]",
"dependsOn": [
"[resourceId('Microsoft.Web/serverfarms/', parameters('hostingPlanName'))]"
],
"tags": {
"[concat('hidden-link:', resourceGroup().id, '/providers/Microsoft.Web/serverfarms/', parameters('hostingPlanName'))]": "Resource",
"displayName": "CPUHighAlertRule"
},
"properties": {
"name": "[concat('CPUHigh ', parameters('hostingPlanName'))]",
"description": "[concat('The average CPU is high across all the instances of ', parameters('hostingPlanName'))]",
"isEnabled": false,
"condition": {
"odata.type": "Microsoft.Azure.Management.Insights.Models.ThresholdRuleCondition",
"dataSource": {
"odata.type": "Microsoft.Azure.Management.Insights.Models.RuleMetricDataSource",
"resourceUri": "[concat(resourceGroup().id, '/providers/Microsoft.Web/serverfarms/', parameters('hostingPlanName'))]",
"metricName": "CpuPercentage"
},
"operator": "GreaterThan",
"threshold": 90,
"windowSize": "PT15M"
},
"action": {
"odata.type": "Microsoft.Azure.Management.Insights.Models.RuleEmailAction",
"sendToServiceOwners": true,
"customEmails": []
}
}
},
{
"apiVersion": "2014-04-01",
"name": "[concat('LongHttpQueue ', parameters('hostingPlanName'))]",
"type": "Microsoft.Insights/alertrules",
"location": "[resourceGroup().location]",
"dependsOn": [
"[resourceId('Microsoft.Web/serverfarms/', parameters('hostingPlanName'))]"
],
"tags": {
"[concat('hidden-link:', resourceGroup().id, '/providers/Microsoft.Web/serverfarms/', parameters('hostingPlanName'))]": "Resource",
"displayName": "AutoScaleSettings"
},
"properties": {
"name": "[concat('LongHttpQueue ', parameters('hostingPlanName'))]",
"description": "[concat('The HTTP queue for the instances of ', parameters('hostingPlanName'), ' has a large number of pending requests.')]",
"isEnabled": false,
"condition": {
"odata.type": "Microsoft.Azure.Management.Insights.Models.ThresholdRuleCondition",
"dataSource": {
"odata.type": "Microsoft.Azure.Management.Insights.Models.RuleMetricDataSource",
"resourceUri": "[concat(resourceGroup().id, '/providers/Microsoft.Web/serverfarms/', parameters('hostingPlanName'))]",
"metricName": "HttpQueueLength"
},
"operator": "GreaterThan",
"threshold": 100.0,
"windowSize": "PT5M"
},
"action": {
"odata.type": "Microsoft.Azure.Management.Insights.Models.RuleEmailAction",
"sendToServiceOwners": true,
"customEmails": []
}
}
},
{
"apiVersion": "2014-04-01",
"name": "[parameters('webSiteName')]",
"type": "Microsoft.Insights/components",
"location": "East US",
"dependsOn": [
"[resourceId('Microsoft.Web/sites/', parameters('webSiteName'))]"
],
"tags": {
"[concat('hidden-link:', resourceGroup().id, '/providers/Microsoft.Web/sites/', parameters('webSiteName'))]": "Resource",
"displayName": "AppInsightsComponent"
},
"properties": {
"ApplicationId": "[parameters('webSiteName')]"
}
}
],
"outputs": {
"EnvironmentLocation":{
"type": "string",
"value": "[parameters('_artifactsLocation')]"
},
"EnvironmentSAS":{
"type": "string",
"value": "[parameters('_artifactsLocationSasToken')]"
},
"appServiceName":{
"type": "string",
"value": "[parameters('webSiteName')]"
},
"sqlSrvName":{
"type": "string",
"value": "[parameters('sqlserverName')]"
}
}
}
