
DO NOT DELETE “Style Resource Readers” Group


So the other day at my customer’s, the site collection used for training just stopped working. At least, that’s what I was told.

What had stopped working was creating any subsites, lists, or libraries.

The error was vague and unhelpful. The SharePoint logs simply showed an IO “File not found” error shortly after the attempt to activate the “WikipageHomePage” feature.

While users with the appropriate permissions could not create a site, site collection administrators could, so this was simply a permissions issue.

In researching this issue I found lots of similar cases. Most had to do with a lack of permissions on the style/images/master page libraries.

This is when I found that the “Style Resource Readers” group (which, by the way, has a description that says “DO NOT DELETE”) was missing.

This group normally has the following two accounts:

  • Everyone
  • NT AUTHORITY\authenticated users

The Group has the following permissions:

  • Read
  • Limited access

And it is used in the following lists:

  • Site Collection Images
  • Style Library
  • Publishedlinks
  • Quick Deploy Items
  • Master Page Library

Even if you re-create the group, you may find that users still can’t create sites, lists, or libraries.

That is because the hidden taxonomy list (hiddentaxonomylist) probably has no permissions.

Go to site/lists/hiddentaxonomylist.

Add “Everyone” to the list’s permissions, as in the sketch below.
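
If you’d rather script the fix than click through the UI, here is a minimal sketch using the SharePoint server-side object model; the site URL is a placeholder, and “c:0(.s|true” is the claims encoding for “Everyone” on a claims-based web application, so verify both against your own farm:

using System;
using Microsoft.SharePoint;

class GrantHiddenTaxonomyListRead
{
    static void Main()
    {
        // Hypothetical URL: point this at the affected site collection.
        using (SPSite site = new SPSite("http://sharepoint/sites/training"))
        using (SPWeb web = site.RootWeb)
        {
            SPList hiddenList = web.GetList(web.ServerRelativeUrl.TrimEnd('/') + "/Lists/hiddentaxonomylist");

            // Give the list its own permissions (copying the parent's) if it still inherits.
            if (!hiddenList.HasUniqueRoleAssignments)
            {
                hiddenList.BreakRoleInheritance(true);
            }

            // Grant the "Everyone" claim Read on the list.
            SPUser everyone = web.EnsureUser("c:0(.s|true");
            SPRoleAssignment assignment = new SPRoleAssignment(everyone);
            assignment.RoleDefinitionBindings.Add(web.RoleDefinitions.GetByType(SPRoleType.Reader));
            hiddenList.RoleAssignments.Add(assignment);
        }
    }
}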

This may not fix your problem, but it may help answer some of your questions.

Another possible way to recover after the group has been deleted is to flip the Publishing feature off and then on again, at both the site level and the site collection level. This should reset and re-create all the required items. Beware: in some cases it can cause additional pain. A sketch of this follows.
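
If you prefer to script that flip, here is a hedged sketch with the server-side object model. The GUIDs below are the well-known IDs for the “SharePoint Server Publishing Infrastructure” (site collection scope) and “SharePoint Server Publishing” (web scope) features, and the URL is a placeholder; verify both in your farm before running:

using System;
using Microsoft.SharePoint;

class ResetPublishingFeatures
{
    static readonly Guid PublishingSite = new Guid("f6924d36-2fa8-4f0b-b16d-06b7250180fa");
    static readonly Guid PublishingWeb = new Guid("94c94ca6-b32f-4da9-a9e3-1f3d343d7ecb");

    static void Main()
    {
        using (SPSite site = new SPSite("http://sharepoint/sites/training")) // hypothetical URL
        {
            SPWeb web = site.RootWeb;
            // Deactivate web-scoped publishing first, then the site collection feature...
            web.Features.Remove(PublishingWeb);
            site.Features.Remove(PublishingSite);
            // ...and reactivate in the opposite order.
            site.Features.Add(PublishingSite);
            web.Features.Add(PublishingWeb);
        }
    }
}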

References:

https://support.office.com/en-us/article/default-sharepoint-groups-in-sharepoint-server-50855943-c151-42bf-b6ea-07f2bb366ecc

https://sharepoint.stackexchange.com/questions/92862/style-resource-readers-allow-access-to-list-views

User with full control cannot create sub sites or projects


New Innovations for SharePoint Developers


This post is provided by Senior App Dev Manager Ed Tovsen, who highlights some of the new innovations available for SharePoint developers.


SharePoint development has evolved over the years since its initial release in 2001. Customization options have varied from fully trusted code, to sand-boxed solutions, to add-ins. With each new option, Microsoft has intended to simplify and control how customers customize their SharePoint environments. This past year Microsoft has continued innovating and introduced some new services and frameworks to expedite the customization of SharePoint environments. Two of the new services, PowerApps and Microsoft Flow, are designed for rapid custom development and deployment. Microsoft is finding that these types of services help developers and power users quickly tackle SharePoint requests, making SharePoint more useful and valuable for corporations. Additionally, Microsoft recently introduced a new client-side development model called SharePoint Framework (SPFx). Below are overviews for each of these new innovations and links to provide you with the specifics.

PowerApps

PowerApps is a service built for developers and analysts to create online forms that connect to numerous data sources, including SharePoint Online and SharePoint on-premises via a data gateway. Using either PowerApps Studio or the PowerApps web designer, you can quickly create apps that address specific needs without writing code or struggling with integration issues. PowerApps generates custom apps that run on all devices, including mobile, as well as the web. PowerApps is designed for a corporate environment where apps can be shared with employees.

PowerApps is tightly integrated with the new modern SharePoint experience. The modern SharePoint List menu includes a PowerApps button to create a new app for the current list. At this time, the modern SharePoint experience is only available in SharePoint Online. It will be included as part of an upcoming feature pack for SharePoint 2016.

[Image: the PowerApps and Microsoft Flow commands in the modern SharePoint list menu]

When you click Create an app, a pop-up appears allowing you to name the app. After the name is entered and the Create button is clicked, the PowerApps web designer opens in the browser. Because the PowerApps web designer knows the context of the SharePoint list, it automatically creates a default app based on the schema and data of the list. You can then customize the app to meet your business requirements.

[Image: the generated employee app in the PowerApps web designer]

Apps created using these steps will be listed as a view and can be shared or launched from within the SharePoint modern list experience. This allows you to leverage PowerApps to build custom, mobile-optimized views of SharePoint lists and share them with co-workers. Lastly, PowerApps is a cross-platform service which allows apps to run on all devices including Windows, iOS, and Android.

Microsoft Flow

Microsoft Flow is a service that allows developers and analysts to create automated workflows between applications and services to synchronize files, get notifications, collect data, and more. Using templates or starting from scratch, developers create flows that turn repetitive tasks into multistep workflows. For example, you could get an email notification every time a new item is added to a SharePoint list. Microsoft Flow connects to both SharePoint Online and SharePoint on-premises using the same data gateway as PowerApps.

Like PowerApps, Microsoft Flow is integrated into the modern SharePoint list menu, as you can see in the image above. When you click Create a flow, a pop-up appears to create a flow for the SharePoint list. After selecting a template, the Flow web designer opens in the browser. Because the Flow web designer knows the context of the SharePoint list, it prefills the steps in the flow. You can then customize the flow to meet your business requirements.

[Image: selecting a Flow template]

Microsoft Flow is the successor to SharePoint Designer for common business scenarios such as approvals, document review, and onboarding/offboarding. Going forward, it will be the default tool for building business automation in SharePoint.

SharePoint Framework

The SharePoint Framework (SPFx) is a web part and page model that enables fully supported client-side development as well as support for open source tooling. Introduced in May 2016, the SharePoint Framework is initially focused on extending the SharePoint user interface using client-side web parts. SPFx aims to solve the difficulty of keeping up with the evergreen model of SharePoint Online. SPFx provides a standardized framework to create custom user interface extensions as well as building applications on top of SharePoint Online.

Microsoft built the SharePoint Framework from the ground up using a modern web stack including TypeScript/JavaScript, HTML, and CSS. All parts of the generated artifacts are executed locally in the browser. SPFx comes with a completely new set of tooling that is platform agnostic and works on both PC and Mac. It is based on open source technologies such as Node.js, Gulp, Webpack, and Yeoman. The SharePoint Framework and tools are used at build time to streamline the developer experience for building, packaging, and deploying.

[Image: TypeScript, Node.js, Gulp, and Yeoman logos]

The SharePoint Framework runs in the context of the current user and connection in the browser, not using iFrames. The controls are rendered in the normal page Document Object Model (DOM) and are responsive and accessible in nature.


The SharePoint Framework reached General Availability (GA) in February 2017. Currently, the SharePoint Framework is only applicable for web parts running in SharePoint Online. Microsoft is planning to bring the SharePoint Framework to SharePoint 2016 on-premises during 2017, as part of a future feature pack. The SharePoint Framework roadmap also includes full page apps, which will render in full page mode and not as web parts in SharePoint.

Additional Information


Premier Support for Developers provides strategic technology guidance, critical support coverage, and a range of essential services to help teams optimize development lifecycles and improve software quality.  Contact your Application Development Manager (ADM) or email us to learn more about what we can do for you.

TFVC to Git – How do I???

Issues with Visual Studio Team Services – 06/28 – Mitigated


Final Update: Wednesday, June 28th 2017 23:28 UTC

We’ve confirmed that all systems are back to normal as of 22:31 UTC.
Customers should no longer have issues accessing their Visual Studio Team Services accounts.

Thanks,
Ian


Update: Wednesday, June 28th 2017 22:32 UTC

We are continuing to investigate the intermittent failures with Visual Studio Team Services. A subset of users (approx. 224) in North Central US may experience a ‘503 – Service Unavailable’ error while accessing their accounts. Our partner teams have been engaged and are assisting with the investigation.

  • Next Update: Before Thursday, June 29th 2017 00:30 UTC

Thanks,
IanSt


Initial Update: Wednesday, June 28th 2017 21:27 UTC

We are investigating intermittent failures with Visual Studio Team Services. A subset of users in North Central US may experience a ‘503 – Service Unavailable’ error while accessing their accounts.

  • Next Update: Before Wednesday, June 28th 2017 22:30 UTC

Sincerely,
Rajesh

Replace ConnectionString values using transformations in Team Services


First off, check out this great overview https://blogs.msdn.microsoft.com/devops/2017/04/05/managing-configuration-app-settings-for-multiple-environments-in-your-cd-pipeline/

And be aware of how stinky-easy this is when deploying to Azure web sites.  Note: creation of variables in Team Services and slot settings in Azure is typically done by an Operations or Infrastructure/Deployment team, *not* by developers :).

And also for IIS apps https://www.visualstudio.com/en-us/docs/build/apps/cd/deploy-webdeploy-iis-deploygroups#environment-specific-configuration

 

But what about other app types that don’t have tasks that incorporate transforms?  It’s still easy, using Colin’s ALM Build and Release Tools extension, specifically the Replace Tokens task.

 

Here’s a quick walkthrough, using a connection string value as an example.

  1. Create a variable in each environment with the name/value pair for the connection string. The “Name” must match the name of our connection string in web.config.
  2. Add the “Replace Tokens” task to the release. This will search all files with a .config extension and replace any tokens whose names match an environment variable.
  3. In order to retain the ability to do easy debugging locally, keep the appropriate connection string setting in web.config.
  4. Create “Debug” and “Release” configs that will tokenize the connection string when the app is built by VSTS. In our case, the transform will replace the actual connection string value with “__DefaultConnectionString__” so that the Replace Tokens task from step 2 can find and replace it (see the sketch below).
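
For step 4, the transform itself can be as small as the hedged sketch below of a Web.Release.config; “DefaultConnectionString” is a placeholder name that must match both the entry in web.config and the variable name in Team Services:

<?xml version="1.0"?>
<!-- Sketch of a Web.Release.config XDT transform that stamps the token in at build time. -->
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <connectionStrings>
    <add name="DefaultConnectionString"
         connectionString="__DefaultConnectionString__"
         xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
  </connectionStrings>
</configuration>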

That’s it!  Debug locally using a local database (or whatever is appropriate), and have releases flow into the release pipeline where all other connection strings will be populated based on the environment in Team Services, or the slot setting in Azure.

Easy!

 

Update to Expanded Device Targeting


Last year, we added support for the ability to manage bid adjustments by device type using the API. We are now relaxing the constraint for desktop bid adjustments. As this announcement explains, you can now set the bid modifier for desktop to anywhere within the range -100% to +900%. Prior to this change the supported values for desktop were 0% to +900%.

Now you can customize desktop bids the same way as the other device types, i.e., tablets and smartphones. The new range is available in both the Campaign Management and Bulk APIs (v10 and v11).
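
As a sketch of what this looks like in code (assuming the Campaign Management v11 proxies from the Bing Ads .NET SDK; campaignId and the service client are placeholders you would wire up yourself), the snippet below opts a campaign out of desktop with a -100% modifier, which this change makes possible:

using Microsoft.BingAds.V11.CampaignManagement;

// Hedged sketch: verify the type names against the SDK version you actually use.
var desktopBidAdjustment = new BiddableCampaignCriterion
{
    CampaignId = campaignId, // placeholder: your campaign ID
    Criterion = new DeviceCriterion { DeviceName = "Computers" },
    CriterionBid = new BidMultiplier { Multiplier = -100 } // valid range is now -100 to 900
};

var request = new AddCampaignCriterionsRequest
{
    CriterionType = CampaignCriterionType.Targets,
    CampaignCriterions = new CampaignCriterion[] { desktopBidAdjustment }
};
// e.g., await Service.CallAsync((s, r) => s.AddCampaignCriterionsAsync(r), request);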

If you have any questions in this regard, feel free to let us know your feedback at bingads-feedback@microsoft.com.

Sitefinity based Azure App Service affected after .NET Framework 4.7 update


Background: As previously announced here, Azure App Service will get support for .NET Framework 4.7 by the end of June 2017. Recently we have seen instances where web apps started failing with the symptom below as a result of the .NET Framework 4.7 update. If your Azure App Service is affected, please follow the recommendations below.

 

Symptoms: A Sitefinity-based ASP.NET Azure App Service might not work as expected and could fail with a System.TypeInitializationException, or with the error “The type initializer for ‘Telerik.Sitefinity.Security.SecurityManager’ threw an exception.”

 

Cause: This problem occurs when a complex, nested Sitefinity custom membership provider integration causes the app to break and raise the exception. So far this symptom has been observed only with Sitefinity-based ASP.NET web apps. We are not aware of any other ASP.NET web apps having this issue.

 

Solution: Because the majority of reports involve Sitefinity-based ASP.NET web apps, Sitefinity has suggested following the recommendations below and installing the Sitefinity hotfix, then deploying it to the Azure App Service:

http://knowledgebase.progress.com/articles/Article/local-site-suddenly-stopped-working-after-running-the-windows-10-updates?q=Local+site+suddenly+stopped+working+after+running+the+Windows+10+Updates&l=en_US&c=Product_Group%3ASitefinity&fs=Search&pn=1

Microservices == Microdatabases


I’ve been playing in the microservices conversation for quite a while now, and just wanted to call out a really nice codebase/walkthrough at https://github.com/dotnet-architecture/eShopOnContainers

While we’re on the topic, the first question I often hear is “what is a microservice?” or “what is the difference between a microservice and SOA?” Seeing as microservices don’t even have an actual definition, it’s more than a fair question.  If someone is familiar with SOA, then my favorite place to begin discussions is with this image, and to point out something that most folks don’t immediately apprehend.

It is very common (and usually required) to split and duplicate data across the data stores for each microservice.

This blew my mind when I first considered it, but it makes perfect sense and is a completely reasonable (and appropriate!) thing to do.  In the image below, we’ll want to service/release/version/enhance each of the microservices independently.  Take Ordering, for example: if we don’t store enough Identity information in the Ordering service to allow folks to create orders while the Identity service is being rev’d, we’ve still got a horizontal dependency that makes those two services monolithic in practice. Here’s a sketch of what that duplication can look like.
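
To make that duplication concrete, here is a minimal sketch; the type and property names are illustrative, not taken from eShopOnContainers. The Ordering service keeps its own denormalized copy of the buyer details it needs, refreshed by integration events, instead of calling the Identity service at order time:

using System;

// Ordering's own view of a buyer: a deliberate, denormalized copy of just the
// Identity fields that orders need, stored in the Ordering database.
public class OrderBuyerSnapshot
{
    public Guid BuyerId { get; set; }           // the same identifier Identity uses
    public string DisplayName { get; set; }     // copied when the buyer registers or updates
    public string ShippingAddress { get; set; }
}

public class Order
{
    public Guid OrderId { get; set; } = Guid.NewGuid();
    public DateTime PlacedUtc { get; set; } = DateTime.UtcNow;

    // Duplicated data, on purpose: because the snapshot travels with the order,
    // Ordering can keep taking orders even while Identity is being rev'd.
    public OrderBuyerSnapshot Buyer { get; set; }
}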

Microservices == Microdatabases + duplication 🙂


[Administration] Instant File Initialization


When we grant the Perform Volume Maintenance Tasks permission to the account that starts the SQL Server service (in the Windows Local Security Policies), we allow SQL Server to allocate space for data files almost instantly; this capability is known as “Instant File Initialization” (IFI).

In practice this greatly speeds up creating or growing data files, restoring backups, and so on; note, however, that IFI has no effect on log files.

Many customers ask why IFI makes space allocation so fast. The reason is that, by default, when SQL Server needs to allocate space for a file, it fills the new space with zeros. The larger the allocation, the more zeros SQL Server has to write and, therefore, the longer the wait. When IFI is enabled, SQL Server does not fill the new space with zeros, so the operation completes almost instantly.

To observe IFI’s behavior, we can take a small disk and create a large .txt file on it with some content; in my scenario, I created a file containing the phrase “O Rato Roeu a Roupa do Rei de Roma” repeated countless times. Then we delete the .txt file (or move it to another disk).

Using a hex editor to inspect the disk’s contents, we can see that even though the .txt file was deleted, its contents are still recorded there:

Now, with Instant File Initialization enabled, we create a SQL Server database on that disk. If we then open the hex editor and examine the contents of the data files, we will see (to the astonishment of many) that the new .mdf appears to be filled with thousands of repetitions of “O Rato Roeu a Roupa do Rei de Roma”. Because IFI was enabled, SQL Server skipped the extra work of zero-filling the space allocated for the new .mdf, so the old .txt data was never overwritten. As SQL Server starts receiving new data into this database, the old data will be overwritten.

If we repeat the same test with IFI disabled, we will see that creating the database takes longer, and in the hex editor we will see that SQL Server filled the newly allocated space with zeros:

Additional points

  • After granting the Perform Volume Maintenance Tasks permission, the SQL Server service must be restarted;
  • For clustered instances, the permission must be granted on every node;
  • In SQL Server 2016, this permission can be granted during setup.
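
To verify whether IFI is actually in effect, you can query sys.dm_server_services. Here is a small sketch; it assumes SQL Server 2016 SP1 or later (which exposes the instant_file_initialization_enabled column) and a placeholder connection string:

using System;
using System.Data.SqlClient;

class CheckInstantFileInitialization
{
    static void Main()
    {
        const string connectionString = "Server=localhost;Integrated Security=true"; // placeholder
        const string query =
            "SELECT servicename, instant_file_initialization_enabled " +
            "FROM sys.dm_server_services";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(query, connection))
        {
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Prints Y or N per service; the Database Engine row is the one that matters.
                    Console.WriteLine("{0}: IFI enabled = {1}", reader[0], reader[1]);
                }
            }
        }
    }
}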

 

Additional references

Database Instant File Initialization: https://docs.microsoft.com/en-us/sql/relational-databases/databases/database-instant-file-initialization

SetFileValidData function: https://msdn.microsoft.com/en-us/library/aa365544(VS.85).aspx

 

Silas

The code and techniques described in this blog are presented to the reader ‘as is’, without any warranty, whether express or implied, of accuracy, completeness, fitness for a particular purpose, title or non-infringement, and none of the third-party products or information mentioned in the work are authored, recommended, supported or guaranteed by any of the authors or Microsoft. Further, the authors shall not be liable for any damages you may sustain by using this information, whether direct, indirect, special, incidental or consequential, even if they have been advised of the possibility of such damages. Your use of the information contained in these pages is at your sole risk.

FAQ : Azure marketplace for Web Apps


Here is a list of frequently asked questions designed to provide a better understanding of the Azure marketplace for Web Apps.

  1. How do I submit my web application to the Windows Web Application Gallery?
    A. You can submit your application here. Before you submit the application, please read through the guidelines and process for submission in this article.
  2. Why is the Deployment option connected to a GitHub repository when I deploy an application like WordPress from the marketplace?
    A. The deployment process for applications from the marketplace uses a Git deployment method. This makes it easier for app owners to push updates to the application as quickly as possible and have them available to Azure users. Hence we use a GitHub repository, configured by the app owner with the application code, that is deployed during provisioning of the application.
  3. How long does it take for the Web Application Gallery team to validate the application?
    A. Once the application is submitted, it will take 3-5 business days for us to validate it and send you the status.
  4. How do I build a package for the Web Application Gallery?
    A. Please refer to this article on how to build a package for the Azure marketplace.
  5. How do I test my application for the Windows App Gallery?
    A. Find the process to test your application in the following article.
  6. An application was removed from the marketplace; how do I deploy the same solution?
    A. If an application is removed from the marketplace, it is no longer supported by the application owner in the marketplace. We remove an application when it no longer has support from the application owner to maintain fixes or issues. If you want to deploy the same application, you can. Follow these steps to do so:
    • Create an empty Web App and any additional resources, such as MySQL or SQL DB, that the application may need.
    • Access the web application’s file storage and deploy the code via FTP or Git.
    • Browse the application and complete its installation based on the documentation provided for the application framework.
    • If you run into issues, please report them in the community forums for the application being used.

HealthVault S66 release


The HealthVault S66 release has been deployed to PPE and will be available next week in the production environment. There are no changes impacting HealthVault developers. Please use the MSDN Forum to report any issues.

Azure Logging and Auditing–Turn Chaos into Order


Azure is a big thing, so big that a lot of you tell me it’s hard to keep up.

But you’re also IT pros; you’re used to learning new things. IT definitely isn’t the profession to choose if you don’t want to learn new things.

Indeed, this is the life we’ve chosen.

And that’s a good thing. But help is appreciated, especially when it comes to Azure logging.

Azure logging can be complex: there are a lot of moving parts and a lot of different types of logs.

Wouldn’t it be good to have a basic schema, a basic understanding, of all the different types of logs and how you can use them?

That’s where the Azure Logging and Auditing paper comes in! We heard you: your challenges, frustrations, blockers, and overhead generators. We thought about those and put this paper together to ease your challenges, reduce your frustrations, remove your blockers, and lower your overhead.

Check out Azure Logging and Auditing and let us know what we need to add!

And if you like it, let us know about that too 🙂

Thanks!

Tom

Tom Shinder
Program Manager, Azure Security
@tshinder | Facebook | LinkedIn | Email | Web | Bing me! | GOOG me

Premier ITSM Testing Services: Anatomy of a Lab Engagement (Part 2)


Sanjeev Gogna and Charles Ofori, Senior Application Development Managers, discuss the importance of testing and spotlight the resources available to our customers as part of a testing engagement.


This is the second of a three-part series discussing the importance of testing and spotlighting the resources available through Premier Support and our IT Support Management (ITSM) team. In the first blog post we introduced the features of Premier testing services. In this blog, we will focus on the stages of such a lab engagement: planning, execution, and post-lab activities.

Planning

This phase will typically last one to a couple of months, depending on complexity. In this phase we work with our customers to clearly define the objectives and goals of the lab engagement. We also focus on determining their physical and software topology. An initial test plan is obtained from the customer so that conversations can be initiated with a Microsoft Test Architect.

  • Determine lab goals and objectives
  • Determine hardware and software requirements for the testing topology
  • Determine the high level test plan and application architecture
  • Determine the Microsoft subject matter experts needed to help execute the test engagement

Here are some of the questions that help us to properly scope out the items listed above. This will help you see the level of detail we need to capture and understand for a successful lab engagement.

I. Background

This is where the customer provides some basics about the engagement itself, to include:

  • Customer Background (Market segment, etc.)
  • Business Need (Marketing numbers, hardware requirements, porting, etc.)
  • Engagement Objectives (Goal statement—2-3 sentence description of goals)
  • Success Criteria (Listing of metrics for success—concurrent users, pages/sec, etc.)

II. Architecture

  • What is the architecture of the application you will be testing? How many tiers (Web Application 3-tier, Client Server 2-tier, Database Application 1-tier, Web Services n-tier)? Is the database SQL Server or Oracle? Will you test using DBLib, ODBC, OLE-DB, or OCI?
  • Is the middle tier using MTS, IIS, ASP.NET, COM, JSP, ISAPI, DCOM, or ASP? Is it written in VB, C++, or Java?
  • What application protocols are you testing (HTTP, HTTP Secure Sockets, FTP, .NET, RMI (Java), EJB, POP3, SMTP, LDAP etc…)?
  • Is the front end thin or fat? Using IE, Netscape, DHTML, ActiveX, Java Script, VB Script, VB, VC++ or Java?
  • Which tier do you wish to test? Will this be a db (ODBC, OLEDB) benchmark or a middle tier (IIS) benchmark? Or a functional front-end test (GUI)?
  • Will Wide Area Network (WAN) connectivity be simulated? If so, what are the metrics for the WAN links to be simulated (i.e. packet loss, latency, speed, etc)?

III. Testing Requirements

  • Do you have scenarios identified that you want to test? How many scripts do you expect to run?
  • Will you need actual physical clients (one instance per client machine) or virtual clients (multiple instances per client machine)?
  • Have you tested this application before? If so, what has been used for testing (3rd party software?). Have you used Mercury LoadRunner/WinRunner or Segue SilkPerformer/SilkTest or IBM Rational Software for any testing?

IV. Hardware Requirements

  • What kind of hardware is required for this test? How many servers will be needed?
    • How many client machines will you need?
    • What are the configuration requirements? DB Server (disks, disk configuration, RAM, processors),
    • Web Servers (how many, disks, disk configuration, RAM, processors),
    • Application Servers (how many, disks, disk configuration, RAM, processors),
    • Client machines (how many, disks, disk configuration, RAM, processors)? Are there any external (SAN or SCSI) disk requirements needed for the servers i.e. database? If so, what type and what is the configuration?
  • Do you have any load balancing requirements? If so, what type (software or hardware)?
  • What OS and IE version needs to be loaded on which machines (Database server, Web servers, Application servers, Client Servers etc.)? Would you like all the critical windows updates installed on all of the machines? *All other software (SQL, BackOffice, Exchange etc…) beyond the OS and IE must be installed by the Engagement Owner or the Customer.
  • Will you be bringing any equipment? If so, we need lots of details on this (power, footprint, etc.)
  • How many people will be attending? (Please list their names and role if possible)
  • How will you be bringing your data? (CD, DVD, DLT Tape, Laptop, or Hard drive)? If you are bringing a DLT tape, what type will it be? If you are bringing a hard drive, what type will it be?

V. Knowledge Resources Required

  • Are any technical resources required to attend during the engagement for the purposes of consultation? We will need a detailed test plan identifying which days you need what type of resource so that they can be scheduled accordingly.

Based on the information above, our consultants will create the environment needed and install the appropriate Microsoft software.

Execution Phase

This phase typically lasts 2-3 weeks, again depending on complexity. In most cases the engagement is delivered onsite at Microsoft labs.

  • Customer will be given remote access to the lab environment so that they can install any non-Microsoft software and run a smoke test validation.
  • Customers will then travel to our lab facilities and work with their assigned Microsoft Test Architect to set up a plan for daily test runs. They are also able to conduct remote testing; this involves an extra step of understanding the impact of latency on test scenarios and any mitigations that need to be incorporated.
  • Establish a baseline run to compare against. Once the baseline has been established, each iteration will be measured against it to gauge whether it exceeded the baseline’s metrics. The baseline run serves as the performance acceptance criteria.
  • After the baseline is established, we test for each scenario on the test plan one by one. For a typical two-week execution, it follows the schedule below:

[Image: typical two-week execution schedule]

In a three-week engagement, the final week is used for additional summary reports and for setting up the final closing call, which may include other remote stakeholders and developers who can directly benefit from the engagement’s outcome. During the closing call, additional recommendations may be discussed along with action plans to advance the learnings from the engagement with Microsoft Premier Developer.


Premier Support for Developers provides strategic technology guidance, critical support coverage, and a range of essential services to help teams optimize development lifecycles and improve software quality.  Contact your Application Development Manager (ADM) or email us to learn more about what we can do for you.

Application Insights – Advisory 06/20

We’ve delayed switching our email delivery system for all Application Insights related emails this week due to complications with the new system. Our team is actively resolving the issues prior to switching to the new system. With the holiday weekend upon us, we will push the switchover out until 7/05/2017.

The result will be the same as stated previously, Application Insights emails will be delivered from the following email address: ai-noreply@applicationinsights.io, instead of ai-noreply@microsoft.com.


Starting from 7/05/2017, we will begin switching all Application Insights emails to be sent from ai-noreply@applicationinsights.io. Customers might also notice small cosmetic changes. The work is expected to be completed by 7/07/2017.


Please refer to the following documentation for any additional details/information regarding Application Insights: https://docs.microsoft.com/en-us/azure/application-insights/

-Vitaliy


We are working on switching our email delivery system for all Application Insights related emails. As part of this change, Application Insights emails will be delivered from the following email address: ai-noreply@applicationinsights.io, instead of ai-noreply@microsoft.com. Starting from 6/28/2017 20:00 UTC, all Application Insights emails will be sent from ai-noreply@applicationinsights.io. Customers might also notice small cosmetic changes.

Please refer to the following documentation for any additional details/information regarding Application Insights: https://docs.microsoft.com/en-us/azure/application-insights/

-Vitaliy

How future policy and regulations will challenge AI


April 25, 2017 – Paul Nicholas, Senior Director, Trustworthy Computing

This post is a translation of “How future policy and regulations will challenge AI”.

Recently, I wrote about how quickly artificial intelligence (AI) might be incorporated into cybersecurity. In reality, however, technology revolutions usually move more slowly than we think. We tend to focus on particular moments, from the launch of Sputnik in 1957 to the unveiling of the iPhone in 2007, and call them “revolutionary change”, without properly appreciating that each such moment arrives only after many unglamorous stages of innovation, implementation, and regulation. So what can we expect from this iterative and rather mundane process of AI development? Let’s consider it not only in terms of technological progress, but also in terms of how technology interacts with national policy-making.

There are two perspectives; they overlap, but they are clearly distinct. The first concerns the fact that information and communication technology (ICT) and its applications evolve faster than the law. In recent years, examples such as social media and ride-hailing apps have shown how this speed of evolution shapes the regulatory experience:

  1. Innovation: the R&D process yields one or many viable technology options.
  2. Implementation: these options are used in the real world, are refined through experience, and begin to spread in major global markets.
  3. Regulation: governments intervene, either to preserve the status quo or to address new categories of problems (such as cross-border data flows).
  4. Unintended consequences: the interplay of policy and technology accidentally harms one or both (for example, the Wassenaar Arrangement’s impact on cybersecurity R&D).

AI may well follow a similar path. However, unlike e-commerce or the sharing economy (and more like nanotechnology or genetic engineering), AI stirs public anxiety, so regulatory intervention may come at an early stage. For example, if AI were adopted only in limited uses within certain sectors (such as defense or pharmaceuticals), it might be judged easier to manage and control than generally applied AI. But would such limited use really be imposed, especially considering AI’s potential for innovative and creative leaps? Almost certainly not, so further governance will be needed. Setting aside the fourth stage (unintended consequences), the “unknown unknowns”, the third stage (regulation) can hardly avoid trouble, because it requires legally defining something as unprecedented and changeable as AI. Even at this foundational stage of AI’s encounter with regulation, then, we can expect plenty of problems involving innovators, implementers, and regulators alike.

The second perspective is more specific to AI and concerns how AI capabilities are expressed. I think it can be broken down into three basic stages:

  1. Detection: building smarter sensors
  2. Direction: automating human decision-making
  3. Delegation: enabling fully independent decision-making

Smarter sensors will arrive in many forms (especially as part of the Internet of Things (IoT)), and the data they aggregate will affect privacy in many ways. The “legacy lenses” of the 20th century are already connected to systems that can recognize license plates and human faces, but truly smart sensors will be able to know almost everything about us: what is in our refrigerator, what is on our shopping list, where we are going, and whom we plan to meet. This “aggregation” and “networking” aspect of smarter sensors is the core of the first AI challenge for policy makers. For example, as these sensors gain the ability to predict what people are likely to do next in order to proactively offer useful information, they could inadvertently create a panopticon, one that unscrupulous criminals operating in the shadows could exploit.

If we can overcome that challenge, AI will be able to support and strengthen human decision-making. Human direction should still be indispensable, but it may eventually be reduced to merely approving or rejecting suggestions the AI generates. From a legal perspective, questions about the scope of culpability should not fall into complete disarray, since humans would still be making the decisions. Narrow applications in highly specialized fields (such as pharmacology or engineering) may be realistic. But if every choice always comes with legally required fine print that is hard to parse, everyday users may be confused. On the policy side, tort and liability law, compulsory insurance for AI providers and users, or new risk-management systems might avoid the downsides of AI-assisted decision-making without undermining the technology’s full usefulness.

Once decision-making without human direction becomes possible, we are in the realm of speculation. But remember: high-frequency trading (HFT) systems already exist in the financial markets, operating on algorithmic instructions without direct human control. And given the whispers that there may be some relationship between flash-crash events and HFT, we can anticipate the range of problems that policy makers and regulators are likely to face. Even with “constrained” AI, it may be difficult to predict how the AI will behave in a given situation. And should a system ever get out of control (in the narrow sense, beyond the control of the people legally responsible for it; in the general sense, beyond anyone’s control), the kinds of up-front liability controls described above may come to feel insufficient to policy makers.

As AI moves away from direct human responsibility, and as existing legal processes lose their relevance, these three stages suggest that serious challenges lie ahead for policy makers. Still, the law can adapt, and solutions may be found. As an extreme example, we might consider extending the concept of the “legal person” to add the concept of an “AI person”. If legal responsibility could be assigned to the AI itself, and the AI could be sued just like a real corporation, would that increase our sense of security? Possibly.

To sum up: the real challenges of AI development do not exist only in the precious moments when a machine defeats a chess grandmaster or passes the Turing test. Rather, obstacles of various kinds will keep arising because our regulatory and policy machinery, still oriented toward the 19th and 20th centuries, cannot function flexibly. It may sound strange coming from a technologist like me, but given AI’s revolutionary potential, that slowness may be a good thing.


Display Playing cards in WPF


I remember each version of Windows had new games. On Windows 3.1, many people played Solitaire. Windows for Workgroups 3.11 introduced Hearts. Somewhere along the way came FreeCell, Minesweeper, and Spider Solitaire. This was years before more modern user interfaces, such as WPF, existed.

Windows no longer ships Cards.dll, which contains the images of the cards and some routines that could be called to draw the cards using older techniques, such as device contexts and GDI.

Using more recent drawing techniques, such as Windows Presentation Foundation (WPF), and more modern languages, such as C# or VB.Net, can make programming games easier.

Below is some sample code that extracts bitmaps from the native Cards.dll and converts them to WPF BitmapSource objects, which can be scaled and displayed as Images in WPF. A timer shuffles the cards. You can click on the form to pause the shuffling.

You can use this code as the basis of a card game. Before writing the code, I was toying with the idea of drawing the cards on a WPF form using WPF immediate-mode graphics, or a combined immediate- and retained-mode graphics system with HwndHost.

Solitaire used a DLL in the Windows\System32 folder called Cards.dll. Because many of you may not have access to a Windows XP machine, I’ve made it available here.

You can open a native DLL, like Cards.dll directly in Visual Studio to examine and edit the resources in the resource editor: File->Open->File->Cards.dll

 


See also

https://blogs.msdn.microsoft.com/calvin_hsia/2004/08/13/displaying-playing-cards-for-bridge/

https://blogs.msdn.microsoft.com/calvin_hsia/2015/10/31/drawing-old-playing-card-images-for-bridge/

https://blogs.msdn.microsoft.com/calvin_hsia/2014/08/29/logo-turtle-graphics-in-wpf/

<code>

using System;
using System.IO;
using System.Runtime.InteropServices;
using System.Windows;
using System.Windows.Controls;
using System.Windows.Interop;
using System.Windows.Media;
using System.Windows.Media.Imaging;
using System.Windows.Threading;

/*
 File->new->Project->C#->WPF App "CardDist"
 download cards.dll from https://onedrive.live.com/redir?resid=D69F3552CEFC21!74629&authkey=!AGaX84aRcmB1fB4&ithint=file%2cDll
 Solution->Add Existing Item Cards.dll (Properties: Copy to Output Directory=Copy If Newer)
 Add Project->Add Reference to System.Drawing
     *
     * * */
namespace CardDist
{
  /// <summary>
  /// Interaction logic for MainWindow.xaml
  /// </summary>
  public partial class MainWindow : Window
  {
    public MainWindow()
    {
      InitializeComponent();
      Width = 1100;
      Height = 800;
      Title = "CardDist";
      this.Loaded += MainWindow_Loaded;
    }

    private void MainWindow_Loaded(object sender, RoutedEventArgs e)
    {
      try
      {
        var sp = new StackPanel() { Orientation = Orientation.Vertical };
        sp.Children.Add(new Label() { Content = "Card Dealing Program. Click to toggle dealing" });
        var canvas = new Canvas();
        sp.Children.Add(canvas);
        this.Content = sp;
        var hghtCard = 100;
        var wdthCard = 80;
        foreach (Cards.Suit suit in Enum.GetValues(typeof(Cards.Suit)))
        {
          for (var denom = 0; denom < 13; denom++)
          {
            // create a new image for a card
            var img = new Image()
            {
              Source = Cards.GetCard(suit, denom),
              Height = hghtCard
            };
            // add it to the canvas
            canvas.Children.Add(img);
            // set its position on the canvas
            Canvas.SetLeft(img, denom * wdthCard);
            Canvas.SetTop(img, (int)suit * hghtCard);
          }
        }
        for (int i = 0; i < Cards.NumCardBacks; i++)
        {
          var img = new Image()
          {
            Source = Cards.GetCardBack(i),
            Height = hghtCard
          };
          canvas.Children.Add(img);
          Canvas.SetTop(img, hghtCard * 5);
          Canvas.SetLeft(img, i * wdthCard);
        }
        var rand = new Random(1);
        var timer = new DispatcherTimer(
            TimeSpan.FromMilliseconds(40),
            DispatcherPriority.Normal,
            (o, args) =>
            {
              for (int n = 0; n < 52; n++)
              {
                // get a random number 0-51
                var tempNdx = rand.Next(52);
                // exchange the Image.Source of the nth one with the tempNdx-th one;
                // the child of a canvas is a UIElement, so we need to cast it to an Image
                var tempSrc = ((Image)canvas.Children[tempNdx]).Source;
                ((Image)canvas.Children[tempNdx]).Source = ((Image)(canvas.Children[n])).Source;
                ((Image)canvas.Children[n]).Source = tempSrc;
              }
            },
            this.Dispatcher);
        this.MouseUp += (om, em) =>
        {
          timer.IsEnabled = !timer.IsEnabled;
        };
      }
      catch (Exception ex)
      {
        this.Content = ex.ToString();
      }
    }

    public class Cards
    {
      public enum Suit
      {
        Clubs = 0,
        Diamonds = 1,
        Hearts = 2,
        Spades = 3
      }
      // put cards in 2 d array, suit, rank (0-12 => 2-A)
      private BitmapSource[,] _bitmapCards;
      public BitmapSource[] _bitmapCardBacks;
      private static Cards _instance;

      public static int NumCardBacks => _instance._bitmapCardBacks.Length;

      public Cards()
      {
        _bitmapCards = new BitmapSource[4, 13];
        var hmodCards = LoadLibraryEx("cards.dll", IntPtr.Zero, LOAD_LIBRARY_AS_DATAFILE);
        if (hmodCards == IntPtr.Zero)
        {
          throw new FileNotFoundException("Couldn't find cards.dll");
        }
        // the cards are resources from 1 - 52.
        // here is a func to load an int rsrc and return it as a BitmapSource
        Func<int, BitmapSource> GetBmpSrc = (rsrc) =>
        {
          // we first load the bitmap as a native resource, and get a ptr to it
          var bmRsrc = LoadBitmap(hmodCards, rsrc);
          // now we create a System.Drawing.Bitmap from the native bitmap
          var bmp = System.Drawing.Bitmap.FromHbitmap(bmRsrc);
          // we can now delete the native bitmap from LoadBitmap
          DeleteObject(bmRsrc);
          // now we get a handle to a GDI bitmap from the System.Drawing.Bitmap
          var hbmp = bmp.GetHbitmap();
          // we can create a WPF BitmapSource now
          var bmpSrc = Imaging.CreateBitmapSourceFromHBitmap(
              hbmp,
              palette: IntPtr.Zero,
              sourceRect: Int32Rect.Empty,
              sizeOptions: BitmapSizeOptions.FromEmptyOptions());

          // we're done with the GDI bmp
          DeleteObject(hbmp);
          return bmpSrc;
        };
        // now we call our function for the cards and the backs
        for (Suit suit = Suit.Clubs; suit <= Suit.Spades; suit++)
        {
          for (int denom = 0; denom < 13; denom++)
          {
            // 0 -12 => 2,3,...j,q,k,a
            int ndx = 1 + 13 * (int)suit + (denom == 12 ? 0 : denom + 1);
            _bitmapCards[(int)suit, denom] = GetBmpSrc(ndx);
          }
        }
        //The card backs are from 53 - 65
        _bitmapCardBacks = new BitmapSource[65 - 53 + 1];
        for (int i = 53; i <= 65; i++)
        {
          _bitmapCardBacks[i - 53] = GetBmpSrc(i);
        }
      }

      /// <summary>
      /// Return a BitmapSource
      /// </summary>
      /// <param name="nSuit"></param>
      /// <param name="nDenom">1-13 = A, 2,3,4,J,Q,K</param>
      /// <returns></returns>
      public static BitmapSource GetCard(Suit nSuit, int nDenom)
      {
        if (_instance == null)
        {
          _instance = new Cards();
        }
        if (nDenom < 0 || nDenom > 12)
        {
          throw new ArgumentOutOfRangeException();
        }
        return _instance._bitmapCards[(int)nSuit, nDenom];
      }

      internal static ImageSource GetCardBack(int i)
      {
        if (_instance == null) // guard: load the card bitmaps if GetCard hasn't been called yet
        {
          _instance = new Cards();
        }
        return _instance._bitmapCardBacks[i];
      }
    }

    public const int LOAD_LIBRARY_AS_DATAFILE = 2;

    [DllImport("kernel32.dll", SetLastError = true)]
    static extern IntPtr LoadLibraryEx(string lpFileName, IntPtr hFileReserved, uint dwFlags);

    [DllImport("User32.dll")]
    public static extern IntPtr LoadBitmap(IntPtr hInstance, int uID);

    [DllImport("gdi32")]
    static extern int DeleteObject(IntPtr o);
  }
}

</code>

HoloLens Animations


This post is provided by App Dev Manager Robert Schumann, who explores the process of creating and animating avatars with HoloLens.


HoloLens Ambitious Beginnings

My first application on HoloLens involved animating a Marine Corps squad to do PT. Using voice commands, avatars would do sit-ups, push-ups, burpees, jumping jacks, etc. That was a year ago, and it was a brutally naïve, overly ambitious learning curve to overcome. Since then HoloLens has had its first “birthday”, seen huge interest and growth, and countless tutorials have been created. But few tutorials exist that provide straightforward guidance about building animated apps for HoloLens.

This blog outlines the steps I took to create, animate, and incorporate avatars into HoloLens. It involves the use of free tools such as Adobe Fuse, Mixamo, and the Unity Editor, the open source HoloToolkit, and Visual Studio. If you follow along to the end of this blog you should have a working HoloLens application with an avatar doing animations controlled by voice commands.

Here we go…

Setup

My setup is based on a Windows 10 Creators Update PC, so pictures and navigation may vary if you’re using a different platform or applications. Make sure your hardware supports the following software resources needed for this blog:

  • Adobe Fuse (Beta)
  • Unity Editor 5.6
  • Visual Studio Community 2017
  • HoloToolkit-Unity from Github

You may also need online accounts from the following vendors;

  • Adobe Creative Cloud (https://accounts.adobe.com)
  • Unity (https://id.unity.com)
  • Microsoft (https://signup.live.com)

If you haven’t already, proceed to sign up for, download, install, and set up the resources above. Some setup notes are provided below.

Additionally, the following links provide overviews of important concepts for completing this blog. Of note is how to correct Mixamo character eyelashes when importing into Unity.

Unity Editor

When installing the Unity Editor, there’s an option to also download and install Visual Studio Community 2017 (VSC2017). If you don’t already have VSC2017, go ahead and check that box. Otherwise, the default settings will suffice.

Spin up

If you’re new to any of the tools used in this blog take a few moments to at least get a familiar grasp of navigating around and identifying key functional aspects.

Adobe Fuse (Beta)

Help: https://helpx.adobe.com/beta/fuse/topics.html

Start Creative Cloud (CC), making sure you log on to your Adobe account. At the CC console, select the Apps tab. Make sure your Fuse installation is up to date.

Once logged on and up to date, click the Open button on the CC console. When Fuse starts, you’ll see a dark canvas in the center and body parts at right. Pretty weird, huh? Ok, so go ahead and drag-and-drop body parts onto the center canvas; choose a head, torso, legs, and arms.

[Image: the assembled avatar in Fuse]

Click the Customize link located above the 3D model. In the right window pane, feel free to randomize attributes of each body section, or modify the attributes individually.

Next click the Clothing link located above the 3D model, next to Customize. Again, in the right window pane, choose clothing for each body section of the 3D model.

Next click the Texture link located above the 3D model, next to Clothing. Click anywhere on the 3D model’s skin. Options now appear in the right window pane to change specific attributes of the 3D model: things like eye and hair color, skin tone, smoothness and age, and teeth condition and coloring. To change specific attributes of other aspects of the model, like clothing color, make sure you’re in Texture mode and click on the part of the model you’d like to modify; options appear in the right pane. Throughout all your changes, for now leave the model resolutions at their default values.

If you’re wondering how to change the model’s pose to something other than a T-pose, don’t worry about that. Just use Fuse to model your avatar for now. We’ll use mixamo.com for rigging and animating the model next.

In Fuse save your work but don’t close anything, leave Fuse running and model ready.

BTW – if you’ve ever wanted a 3D avatar of yourself, check out the following link:
http://blog.mixamo.com/how-to-scan-yourself-into-a-game-ready-3d-model-using-kinect-and-fuse/

Mixamo

Help: https://community.mixamo.com/hc/en-us

Go to Mixamo.com and sign in with your Adobe account. Now go back to Fuse; at the top right corner of the screen you should see the Send to Mixamo button. Click it. Give your model a name. Click Save. Fuse will now send your model over to Mixamo as an asset. Here’s where Fuse and Mixamo start to earn their keep.

A new browser session should begin with your model loaded and ready for rigging. Rigging basically gives your model a skeleton to which movement can be applied, thus enabling animation. Fuse and Mixamo greatly simplify rigging, which can otherwise be very time consuming. On the Auto-Rigger screen, enable Facial Blendshapes, choose 65 joints for Skeleton LOD, then click Finish. (If you’re wondering why your model appears to have black eyes, this is “normal” and we’ll correct it later.)

A new window should load with your model in a T-pose. Again, hold off doing any animations. Click the Queue Download button. Then choose FBX for Unity (.fbx) for the format and Original Pose (.fbx) for Pose. Again, click the Queue Download button. Once ready, go ahead and download your new character. Ok. We just saved off our new character without any animations baked in, per se. Now let’s find the animations we want to use.

(BEGIN REPEATABLE STEP)

Click the Store link at top, then the Animations link under the Search bar. Go ahead and peruse the animations; for this blog I’m going to use the Standing Idle, Samba, Macarena, YMCA, and Salsa animations. Click an animation you’re interested in using. The screen should change, and your character should be displayed doing the selected animation. Click the “Add to Pack” button to the right of your character. Multiple animations can be added to a pack, but for demo purposes we’re going to add and save each animation individually. Since we’re going to use the same character, it won’t matter that we’ll later apply different animations. If you were to use a different model/character, you’d want to apply the animation to that character rather than trying to reuse the same animation across different models/characters. This is because animations applied through Mixamo are based on the rigging of the character used. So if you have, for example, a one-arm zombie doing the YMCA dance rigged to a two-arm character, the animation may not replay correctly.

Once an animation has been added to the pack, click the View/Download button. Click the Queue Download button. Choose FBX for Unity (.fbx), choose Without Skin for Pose or Skin, and change Frames per Second to 60. Click the Queue Download button and, when ready, download the file. Repeat for each animation.

(END REPEATABLE STEP)

Alrighty; that should be it for Fuse and Mixamo. Before we switch gears to Unity, let’s do a quick assessment to make sure we’re ready to move ahead. By now you should have the Unity Editor installed. You should have the HoloToolkit for Unity downloaded. You should have a Fuse export of your character textures. And finally, you should have at least two files downloaded from Mixamo: a character .fbx file and one or more animation .fbx files. Additionally, if you haven’t done so already, you’ll need to make a Unity package of the HoloToolkit-Unity downloaded from GitHub. For instructions, see https://github.com/Microsoft/HoloToolkit-Unity/blob/master/GettingStarted.md.

Unity

Help: https://docs.unity3d.com/Manual/index.html

First time starting Unity you may get a prompt for firewall settings. Check all boxes to avoid complications with this blog.

At the start splash screen, feel free to log on to your Unity account by clicking “My Account”, provide a project name, turn off Unity Analytics, and make sure 3D is checked.


When the new project opens, delete the default camera from the Hierarchy tab. Import the HoloToolkit-Unity custom package; either choose the defaults, or at least the standard assets without any tests or examples. Now go ahead and replace the default camera with the HoloLensCamera prefab.

Once the toolkit is imported you’ll notice a new menu option called HoloToolkit. Under it are three (3) configuration options. From top to bottom, choose and apply each one. For Capability Settings choose Microphone, Spatial Perception, and Internet Client.

Within the Project tab create four (4) new folders – _Animations, _Models, _Prefabs, and _Scripts.

Let’s add our Mixamo character by dragging and dropping the downloaded character .fbx file into the _Models folder. You should be prompted to fix textures; choose Fix Now. Also drag-and-drop the Fuse textures you previously exported into the _Models folder. You should see one Material object named something like *_Packed0_Diffuse. Select that Material; in the Inspector window change the Rendering Mode to Opaque and scale Smoothness all the way down. In the same folder (still _Models) as that Material, create a new Material and name it Hair. Make sure this new Hair Material is selected in the Project tab. Then, using the exported Fuse textures, drag-and-drop the BaseColor texture next to Albedo, the MetallicAndSmoothness texture next to Metallic, and the Normal texture next to Normal Map. Change the Rendering Mode to Fade, and ensure Smoothness is scaled completely down.

We now need to tweak the model as exported from Mixamo. Within the _Models folder should be a prefab of your model, represented by the blue cube icon. Select it, and in the Inspector tab select the Rig button. Change Animation Type to Humanoid. Ensure Avatar Definition is Create From This Model. Click the Configure button, saving first if prompted. What you should see now is the skin of your model with bones inside it. If so, cool, we’re good so far. Under the Inspector tab the Mapping button should be selected, and just above the Transform properties section, on the right side, should be a Done button. Click Done. Now drag-and-drop your model prefab onto the Hierarchy window; it doesn’t matter where. Then select it in the Hierarchy and expand it, exposing all child objects. Click each child object and find the eyelashes and hair objects. As you do, change the default Material to the new Hair Material you previously created: simply select the object, then in the Inspector window drag-and-drop the new Hair Material over the existing one to swap them. If all is correct, your model should look normal at this point. Once correct, drag-and-drop the root object from the Hierarchy window into the _Prefabs folder in the Project window. You’ve now created a prefab and can delete the object from the Hierarchy window.

Let’s add the Mixamo animations by dragging and dropping each downloaded animation .fbx file into the _Animations folder. Again, you should be prompted to fix textures; choose Fix Now. As you do each one, before starting the next, select the animation prefab and in the Inspector window choose the Rig button. Change the Animation Type to Humanoid, but for Avatar Definition this time choose Copy From Other Avatar. Underneath, for Source, click the round icon to select an avatar. A window should pop up with just one avatar to choose, which should be from the character model you previously imported. Select that avatar, close the select-avatar window, and click the Apply button in the Inspector. While still in the Inspector window, click the Animations button next to the Rig button. Scroll down; check Loop Time, check Bake Into Pose for Root Transform Position (Y), and make sure Based Upon (at Start) is set to Original. Click Apply, bottom right. Repeat these steps for each animation .fbx file you import.

Once your model and animations have been imported and staged, we need to create an Animator Controller. Select the _Animations folder in the Project window and create a new Animator Controller; call it Dancing. If the Unity Editor is not displaying the Animator window, go to the Window menu and choose Animator. Once visible, dock it wherever you like. Then drag the newly created Dancing animator controller onto the Animator window. Now drag-and-drop each animation prefab from the _Animations folder onto the Animator window. As you do, you should see a block representing each animation, with a name on it. Pay attention to the name and text case. Once you have all the animations on the controller, right-click the green Entry block and choose Make Transition. You’ll notice your cursor now has an arrow string attached to it. Place your cursor over the standing idle animation and click to lock it in. You should now see a line from the Entry block connected to the idle animation block.

We’re about three-quarters done, with the hard parts taken care of, and are now at the point of assembling everything together.

Go back to the _Prefabs folder and select your character prefab. In the Inspector window, for the Animator properties, drag-and-drop the Dancing animator controller you just created from the _Animations folder onto the Animator Controller property. For the Animator Avatar, again click the round select icon next to the property field and in the pop-up choose your character avatar. Make sure the Apply Root Motion checkbox is checked.

In the Hierarchy window add two HoloToolkit prefabs: BasicCursor and InputManager. While still in the Hierarchy window, select the HoloLensCamera and add three components to this object: Spatial Mapping Collider, Spatial Mapping Renderer, and New C Sharp Script. Name the new C# script Manager. By default, the new script is created in the root Assets folder. Drag and drop the Manager script into the _Scripts folder, then double-click it to open it in the code editor. Replace the existing code with the sample code provided below. Once saved, go ahead and Build & Run the application from the Unity Editor. When Unity finishes building the application and creating a Visual Studio solution, deploy the application to your HoloLens device. When the application runs, you should see the usual Made with Unity logo, then a brief 3-5 second period of nothingness. At this point the spatial mapping functions of the application are beginning to scan the room. As orientation is established you should start to see a wire mesh over room surfaces. You should also see a gaze cursor. Find an open space on the floor and air tap. The avatar character should now be standing idle in that spot on the floor, facing you. You can then use voice commands to animate the avatar: say “Macarena Dance”, “Samba Dance”, “Salsa Dance”, or “YMCA Dance” to make the avatar dance, and say “Idle Dance” to make the avatar stand idle again.

Enjoy!

using System;
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.UI;

using System.Linq;
using UnityEngine.VR.WSA.Input;
using UnityEngine.Windows.Speech;
using HoloToolkit.Unity;
using HoloToolkit.Unity.InputModule;

public class Manager : MonoBehaviour
{
    // The character prefab to place; assign it in the Inspector.
    public GameObject DancerPrefab;

    private Animator anim;
    private Boolean dancerExist = false;
    private GestureRecognizer gestureRecognizer;
    private KeywordRecognizer keywordRecognizer;

    // Maps each recognized keyword to the handler that plays its animation.
    delegate void KeywordAction(PhraseRecognizedEventArgs args);
    private Dictionary<string, KeywordAction> keywordCollection;

    private void Start()
    {
        // Listen for the air-tap gesture so the user can place the dancer.
        gestureRecognizer = new GestureRecognizer();
        gestureRecognizer.SetRecognizableGestures(GestureSettings.Tap);
        gestureRecognizer.TappedEvent += Recognizer_TappedEvent;
        gestureRecognizer.StartCapturingGestures();

        // Register the voice commands and their handlers.
        keywordCollection = new Dictionary<string, KeywordAction>();
        keywordCollection.Add("Samba Dance", SambaDanceCommand);
        keywordCollection.Add("Salsa Dance", SalsaDanceCommand);
        keywordCollection.Add("YMCA Dance", YMCADanceCommand);
        keywordCollection.Add("Macarena Dance", MacarenaDanceCommand);
        keywordCollection.Add("Idle Dance", IdleDanceCommand);

        keywordRecognizer = new KeywordRecognizer(keywordCollection.Keys.ToArray());
        keywordRecognizer.OnPhraseRecognized += KeywordRecognizer_OnPhraseRecognized;
        keywordRecognizer.Start();
    }

    private void KeywordRecognizer_OnPhraseRecognized(PhraseRecognizedEventArgs args)
    {
        KeywordAction keywordAction;

        // Invoke the handler registered for the recognized phrase, if any.
        if (keywordCollection.TryGetValue(args.text, out keywordAction))
        {
            keywordAction.Invoke(args);
        }
    }

    private void Recognizer_TappedEvent(InteractionSourceKind source, int tapCount, Ray headRay)
    {
        RaycastHit hitInfo;

        // On the first air tap, raycast from the camera along the gaze and
        // place the dancer where the ray hits the spatial mapping mesh.
        if (!dancerExist && Physics.Raycast(Camera.main.transform.position, Camera.main.transform.forward, out hitInfo, Mathf.Infinity))
        {
            // This script sits on the HoloLensCamera, so transform.eulerAngles.y is the
            // camera's yaw; adding 180 degrees turns the dancer to face the user.
            GameObject dancer = Instantiate(DancerPrefab, hitInfo.point, Quaternion.Euler(0, transform.eulerAngles.y + 180f, 0));
            anim = dancer.GetComponent<Animator>();
            dancerExist = true;

            // Stop rendering the room's wire mesh once the dancer is placed.
            Camera.main.gameObject.GetComponent<UnityEngine.VR.WSA.SpatialMappingRenderer>().enabled = false;
        }
    }

    // Each handler restarts the matching animator state from its beginning.
    // State names must match the animation blocks in the Dancing controller.
    private void MacarenaDanceCommand(PhraseRecognizedEventArgs args)
    {
        anim.Play("macarena_dance", -1, 0f);
    }

    private void YMCADanceCommand(PhraseRecognizedEventArgs args)
    {
        anim.Play("ymca_dance", -1, 0f);
    }

    private void SalsaDanceCommand(PhraseRecognizedEventArgs args)
    {
        anim.Play("salsa_dance", -1, 0f);
    }

    private void SambaDanceCommand(PhraseRecognizedEventArgs args)
    {
        anim.Play("samba_dance", -1, 0f);
    }

    private void IdleDanceCommand(PhraseRecognizedEventArgs args)
    {
        anim.Play("standing_idle", -1, 0f);
    }

    private void OnDestroy()
    {
        // Unsubscribe to avoid callbacks into a destroyed object.
        gestureRecognizer.TappedEvent -= Recognizer_TappedEvent;
        keywordRecognizer.OnPhraseRecognized -= KeywordRecognizer_OnPhraseRecognized;
    }
}


Premier Support for Developers provides strategic technology guidance, critical support coverage, and a range of essential services to help teams optimize development lifecycles and improve software quality.  Contact your Application Development Manager (ADM) or email us to learn more about what we can do for you.

Joining a meeting without a microphone in Skype for Business 2016


Hello, this is Yoshino from Japan Skype/Teams Support.
This time I’d like to introduce a newly added feature: joining a meeting without a microphone connected.

Originally, Skype for Business was designed around “meetings,” so taking part in an audio conference required both a microphone and speakers (or, more generally, a device to “listen” with and a device to “speak” with, such as a headset).

However, it turned out that a great many meetings consist of participants simply listening to a presenter talk.
We also offer the Skype Broadcast meeting format for this style of session.

Against that background, we learned there was considerable demand to join a meeting with only earphones connected, just to listen in.

So the client has been updated to let you join an audio conference even without a microphone connected.
When you join a meeting with only earphones, the message below is displayed, but you can remain in the meeting without being dropped.

No special configuration is required; once you have updated, the feature is available as-is.

Note: this feature is provided in the Click-to-Run (C2R) builds (Office 365 and MSDN).
For the MSI (volume license) builds, please wait a little longer.

June 2017 App Service update


This month we shipped improvements to the overview blade, a new unified create experience for both Windows and Linux based apps, as well as a new recommendation history UX.

New App Service Overview blade

The overview blade for Web, Mobile, and API apps has been overhauled with new charting controls and performance improvements; this should make browsing through your apps faster. The new charts integrate nicely with Azure Monitor and are perfect for pinning to custom dashboards.


Integrated create experience for Windows and Linux based apps

With this update you can now choose the OS of the App Service plan used to host your app. Learn more about Web App on Linux.


App Service Advisor recommendation history

App Service Advisor provides proactive recommendations on how to solve problems with your application. We have revamped this UI to also include a history of the recommendations that have triggered in the past.


If you have any questions about any of these features, or App Service in general, be sure to check our forums on MSDN and Stack Overflow.

For any feature requests or ideas, check out our User Voice.

Loan Chargeoff Prediction Solution Templates are Live in Cortana Intelligence Solutions Gallery


A charged-off loan is a loan that a creditor (usually a lending institution) has declared unlikely to be collected, typically because repayment is severely delinquent. Chargeoffs have huge implications for a lending institution’s financial statements, and no bank or credit union wants a high percentage of charged-off loans on its books. Knowing ahead of time which loans are most likely to be charged off gives a bank tremendous lead time to save them: a loan manager can work out a personalized repayment plan with the borrower rather than writing the loan off as a chargeoff.

For more information, read this blog: End to End Loan ChargeOff Prediction Built Using Azure HDInsight Spark Clusters and SQL Server 2016 R Service

We have published two solution templates, deployable on two technology stacks, for the chargeoff scenario described above:

  1. Loan Chargeoff Prediction using SQL Server 2016 R Services – Using a DSVM with SQL Server 2016 and Microsoft ML, this solution template walks through how to create and clean up a set of simulated data, train five different models, select the best-performing model, score new data with it, and save the prediction results back to SQL Server. A Power BI report connects to the prediction table and presents interactive reports on the chargeoff predictions.
  2. Loan Chargeoff Prediction using HDInsight Spark Clusters – This solution demonstrates how to develop machine learning models for predicting loan chargeoff (including data processing, feature engineering, and training and evaluating models), deploy the models as a web service (on the edge node), and consume the web service remotely with Microsoft R Server on Azure HDInsight Spark clusters. The final predictions are saved to a Hive table, which can be visualized in Power BI.

With just a few clicks, customers can deploy the solution to their Azure subscriptions and explore the raw data, the data transformations, the R model engineering, and the operationalization of the models. The Power BI report also lets customers explore both historical and predicted chargeoff data. Internal Microsoft sales and field teams can likewise deploy the solution quickly and demo the end-to-end data pipeline and R model development to customers.

We have also published the same solutions to GitHub. Please try out the solution templates and give us your feedback. Here are the links to the blog post, the GitHub repo, and the solution templates:

Blog: https://blogs.msdn.microsoft.com/rserver/2017/06/29/end-to-end-loan-chargeoff-prediction-built-using-azure-hdinsight-spark-clusters-and-sql-server-2016-r-service/

GitHub: https://microsoft.github.io/r-server-loan-chargeoff/index.html

Solution templates in the Cortana Intelligence Gallery: https://gallery.cortanaintelligence.com/Solution/Loan-ChargeOff-Prediction-with-SQL-Server

https://gallery.cortanaintelligence.com/Solution/Loan-ChargeOff-Prediction-with-Azure-HDInsight-Spark-Clusters

 

Ajay Jagannathan (@ajaymsft)

Principal Program Manager
