
DevOps with Bot Framework and Microsoft Graph, Part 22: Direct Line


This time, the topic is Direct Line. This will wrap up the series for now; there are still plenty of topics left, but I have to stop somewhere.

Overview

The Bot Connector provides connectivity to multiple channels such as Skype, Facebook, and email. For apps that are not natively supported, such as LINE or your own custom apps, Direct Line is provided as a REST-accessible way to connect. From a DevOps perspective, Direct Line matters because the functional tests in this series use it.

Features

Direct Line lets you do the following. See here for the API details.

  • Authenticate against the REST API
  • Start and end conversations
  • Send and receive messages

You can connect using either ordinary HTTP requests or WebSocket.

Limitations

If you do not use WebSocket, you cannot receive notifications from the bot application in real time as they occur. The behavior is as follows.

1. Start a conversation and send a message.

2. The send operation is considered complete once all processing inside the bot application has finished. Even if the bot application sends reply messages during that processing, they cannot be retrieved at this point.

3. Request the received messages. All queued messages are returned, including the messages you sent yourself.

Differential receive

When sending and receiving messages within the same conversation, you can attach a value called a watermark to the request so that only messages newer than the previous request are returned.

Functional tests

The functional tests use the Direct Line features through DirectLineHelper.cs. It only starts a conversation and sends and receives messages, so it is a very simple mechanism.
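
For reference, here is a minimal sketch of that kind of helper flow over plain HTTP, including the watermark described above. It is an illustration only; the method name PollRepliesAsync and the fixed retry count are mine, not from the sample repository.

using System;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.Bot.Connector.DirectLine;

class DirectLinePollingSample
{
    // Start a conversation, send one message, then poll for replies over plain HTTP.
    static async Task PollRepliesAsync(string directLineSecret, string userId, string text)
    {
        var client = new DirectLineClient(directLineSecret);
        var conversation = await client.Conversations.StartConversationAsync();

        await client.Conversations.PostActivityAsync(conversation.ConversationId, new Activity
        {
            From = new ChannelAccount(userId, userId),
            Type = ActivityTypes.Message,
            Text = text
        });

        string watermark = null;
        for (var i = 0; i < 10; i++)
        {
            // Passing the previous watermark returns only activities newer than it.
            var activitySet = await client.Conversations.GetActivitiesAsync(conversation.ConversationId, watermark);
            watermark = activitySet.Watermark;

            // The result also contains the message we sent, so filter by sender.
            foreach (var activity in activitySet.Activities.Where(a => a.From.Id != userId))
                Console.WriteLine(activity.Text);

            await Task.Delay(TimeSpan.FromSeconds(1));
        }
    }
}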

Using WebSocket

Since the functional tests only use ordinary HTTP requests, let's build a WebSocket sample.

1. Add a new console app project to the bot application solution. I named it O365Bot.StreamClient.

2. Add the DirectLine and WebSocketSharp packages from NuGet, and add a reference to System.Configuration.


3. Set the DirectLineSecret and UserId keys in App.config.


4. Replace Program.cs with the following code.

using Microsoft.Bot.Connector.DirectLine;
using Newtonsoft.Json;
using System;
using System.Configuration;
using System.Linq;
using System.Threading.Tasks;
using WebSocketSharp;

namespace O365Bot.StreamClient
{
    class Program
    {
        static string userId = ConfigurationManager.AppSettings["UserId"];
        static void Main(string[] args)
        {
            Run().Wait();
        }

        private static async Task Run()
        {
         
            var client = new DirectLineClient(ConfigurationManager.AppSettings["DirectLineSecret"]);
            // Create a conversation
            var conversation = await client.Conversations.StartConversationAsync();
            using (var webSocketClient = new WebSocket(conversation.StreamUrl))
            {
                // Handle received messages
                webSocketClient.OnMessage += WebSocketClient_OnMessage;
                webSocketClient.Connect();

                while(true)
                {
                    var input = Console.ReadLine();
                    if (input == ConsoleKey.Enter.ToString())
                        break;

                    Activity activity = new Activity()
                    {
                        From = new ChannelAccount(userId, userId),
                        Text = input,
                        Locale = "en-US",
                        Type = ActivityTypes.Message
                    };

                    await client.Conversations.PostActivityAsync(conversation.ConversationId, activity);
                }
            }
        }

        private static void WebSocketClient_OnMessage(object sender, MessageEventArgs e)
        {
            if (string.IsNullOrWhiteSpace(e.Data))
                return;
            var activitySet = JsonConvert.DeserializeObject<ActivitySet>(e.Data);
       
            foreach (var activity in activitySet.Activities.Where(x=>x.From.Id != userId))
            {
                // Process the message
                Console.WriteLine(activity.Text);
            }
        }
    }
}

Test run

Run the app, type some text, and check the result.


The basic behavior looks the same, but the messages now come back in real time.

Summary

Direct Line is an easy API to use, but knowing details such as response timing and WebSocket support lets you use it even more effectively.

This concludes the Bot Framework with DevOps series for now. I hope it has given you at least a feel for the approach.


Count down to World Wide finals of Imagine Cup


Scott Guthrie, Microsoft’s executive vice president of Cloud and Enterprise, will host the Imagine Cup Championships, which you can watch via a special webcast at 9 a.m. PDT on Thursday, July 27.

Interested in learning more about the Imagine Cup?

UK Imagine Cup 2017 Finalist Video

Microsoft Helping Students to Start-ups taking their ideas to the next stage.

On-premise TFS to VSTS Migration Whitepaper


 

If your company or team is planning to migrate on-premises Team Foundation Server (TFS) collection(s) to Visual Studio Team Services (VSTS), the whitepaper linked below provides detailed information about the migration process. It walks through migrating a TFS 2008-based instance to VSTS, with screenshots.

 

TFS-to-VSTS-Migration_WhitePaper

TFS-to-VSTS-Migration_WhitePaper_MSWord2003

 

 

 

 

Microsoft Tech Day at Cubo – a completely free event!


On June 21, Microsoft Tech Day will take place – an event where a group of MVPs will show you how Microsoft can turn your idea into a unique venture using the most modern technologies we have to offer. Sign up for this completely free event and learn from experts who are leading references in the market!

 

Register now!

 

See you there!

Application Insights – Advisory 06/20

We are working on switching our email delivery system for all Application Insights related emails. As part of this change, Application Insights emails will be delivered from the following email address: no-reply@applicationinsights.io, instead of ai-noreply@microsoft.com. Starting from 6/26/2017 20:00 UTC, all Application Insights emails will be sent from no-reply@applicationinsights.io. Customers might also notice small cosmetic changes.

Please refer to the following documentation for any additional details or information regarding Application Insights: https://docs.microsoft.com/en-us/azure/application-insights/

-Vitaliy

Skype for Business iOS/Android – Message shown when joining a meeting as a guest


This is the Japan Skype/Lync support team.

When a signed-in mobile client joins another organization's online meeting as a guest, the following message is shown after the meeting URL is accessed.

  • Skype for Business for iOS
  • Skype for Business for Android
  • Message
    Japanese: 「会議専用バージョンのアプリで参加しています。終了後、正規のアプリにサインインし直します。」
    English: “You’re joining using a meetings-only version of the app. We’ll sign you back in to the regular app when you’re done.”

In the previous implementation, a mobile client joining as a guest had to be signed out before accessing the meeting URL. Because this was confusing for users, the implementation was changed so that guest join is possible while signed in; internally, however, the client still has to sign out.
Because the client temporarily enters a signed-out state, the message above is shown to ask for the user's consent. This is not abnormal behavior, such as joining an unauthorized meeting. Guest-join scenarios for users without an account continue to be available as before.

The information in this post (including attachments and linked content) is current as of the date it was written and may change without notice.

Create Bot for Microsoft Graph with DevOps 1: Unit Testing Bot Framework application


In this series, I will develop a chat bot by using Microsoft Bot Framework and Graph API with VSTS to enable DevOps.

Unit Tests for Bot Framework

Testing is one of the key concepts for making DevOps successful, but unit testing a Bot Framework project is different from other project types such as MVC. The article below explains the concept very well, so I won’t duplicate the effort to explain it here.

Unit Testing for Bot Applications

https://www.microsoft.com/reallifecode/2017/01/20/unit-testing-for-bot-applications/
Thanks to the Catalyst team and Mor Shemesh, who shared their precious knowledge and allowed me to re-use their content.

Development Environment

VSTS project

VSTS (Visual Studio Team Services) is the Microsoft tool I use for this project. If you don’t have a VSTS account, sign up for the trial.

1. Create new project in VSTS.


2. I specified Git as version control and Agile for template.


3. Once the project is set up, click [Clone in Visual Studio].


4. Map to local folder.


Visual Studio Project

1. Create new solution in Team Explorer


2. Select the Bot Application template. I named the project O365Bot.


3. Add a Unit Test project, too. I named it O365Bot.UnitTests.


Create unit test helper

The BotBuilder GitHub repository has great resources for unit testing.

1. Download source code from BotBuilder GitHub (https://github.com/Microsoft/BotBuilder)

2. Add a Helper folder in the Unit Test project.

3. Copy the files below into the Helper folder.

– BotBuilder-master\CSharp\Tests\Microsoft.Bot.Builder.Tests\ConversationTest.cs
– BotBuilder-master\CSharp\Tests\Microsoft.Bot.Builder.Tests\DialogTestBase.cs
– BotBuilder-master\CSharp\Tests\Microsoft.Bot.Builder.Tests\FiberTests.cs

As these files also include test classes, delete the following:

– ConversationTest class in ConversationTest.cs
– FiberTests class in FiberTests.cs

4. Add the following NuGet packages and update them to the latest versions in both the bot and unit test projects.

– BotBuilder
– Moq

* I didn’t update the MSTest packages, as the latest version didn’t work well. The version I use is 1.1.11.

5. Add a reference to System.Web and a project reference to O365Bot.

6. Build the solution once.

Add unit test and execute

All the prerequisites are ready. Let’s add a test.

1. Replace the code in UnitTest1.cs with the following.

2. Build the solution.

3. Run the test from Test Explorer.


Summary

In the next article, I will add AuthBot to enable ADAL authentication for Office 365. Don’t forget to check in the code to VSTS.

The code is hosted at my GitHub

Ken

Is social engineering the biggest threat to your organization?


April 19, 2017 – Microsoft Secure Blog staff – Microsoft

This post is a translation of “Is social engineering the biggest threat to your organization?”


“Always keep in mind that amateurs hack systems, while professionals hack people.” – Bruce Schneier, CTO, Counterpane Internet Security, Inc.


Social engineering is spreading worldwide, and the threat it poses to organizational security is growing. Since January 2015, the number of social engineering victims identified by the FBI has increased by 270%, and corporate losses have grown to exceed $2.3 billion.

Social engineering is the practice of hackers using manipulation, persuasion, or fraud to get other people to disclose information or perform some action, so that the attacker profits. Essentially, it tricks people into breaking normal security procedures (such as revealing a password).

The following are examples of common types of social engineering.

  • Spear phishing – Sending email while impersonating someone the user trusts, such as the CEO or the internal IT department, and requesting an action that leaves sensitive information exposed.
  • Dumpster diving – Rummaging through trash to find design documents containing IP, marketing plans, employee performance plans, and other sensitive information such as organization charts and phone lists.
  • 10 Degrees of Separation – Pretending to share a trusted mutual contact with the user in order to make the user more comfortable revealing sensitive information.

No matter how strong your technical security is, an organization's staff is typically its biggest weakness. However, with well-designed, regular, and thorough training, staff can become your greatest asset in the fight against social engineering.

To learn how to implement strong security policies, build a security-aware culture, and protect your organization from the risk of social engineering, see the insider's guide to responding to social engineering.

 

 


Log results from Azure Automation Runbooks directly to OMS


An easy way to keep track of Azure Automation runbook operations is to create log entries to OMS during execution.
https://docs.microsoft.com/en-us/azure/log-analytics/log-analytics-overview

There is a REST API service as part of OMS that lets you post log entries directly into your OMS Workspace, allowing for easy searching and alerting on anything your runbooks do.  There are some code samples provided in the link above that allow you to build your own functions to use this API, or you can use the hard work someone else has done and import the OMSIngestionAPI module, which is what I have done.

Before you start, you will need to have your OMS Workspace Name, ID, and key available.  You can get these from the OMS Workspace settings page.
I recommend using (and will use in this post) Azure KeyVault to hold your OMS Workspace Key securely and allow for any key rotation or changes in a programmatic way.  Have a look at Getting Started – Azure KeyVault to get one set up if you haven’t got one already.

Here’s the code I used to get things setup and ready for use.
This creates Automation variables for OMSWorkspaceName, OMSWorkspaceID and KeyVaultName.
Also created is a KeyVault secret for the OMS Workspace Key, granting the Automation RunAs account rights to get secrets from your KeyVault.

$AutomationAccountName = "{My Automation Account Name}"
$ResourceGroupName = "{My Automation Account Resource Group Name}"
$KeyVaultName = "{My KeyVault Name}"
$KeyVaultResourceGroupName = "{My KeyVault Resource Group Name}"
$OMSWorkspaceName = "{My OMS Workspace Name}"
$OMSWorkspaceID = "{My OMS Workspace ID}"
$OMSWorkspaceKey_Raw = "{My OMS Workspace Key}"
$OMSWorkspaceKey = convertto-securestring ($OMSWorkspaceKey_Raw)  -asplaintext -force

New-AzureRmAutomationVariable -Name "OMSWorkspaceName" -Value $OMSWorkspaceName -AutomationAccountName $AutomationAccountName -ResourceGroupName $ResourceGroupName -Encrypted $false
New-AzureRmAutomationVariable -Name "OMSWorkspaceID" -Value $OMSWorkspaceID -AutomationAccountName $AutomationAccountName -ResourceGroupName $ResourceGroupName -Encrypted $false
New-AzureRmAutomationVariable -Name "KeyVaultName" -Value $KeyVaultName -AutomationAccountName $AutomationAccountName -ResourceGroupName $ResourceGroupName -Encrypted $false
Set-AzureKeyVaultSecret -VaultName $KeyVaultName -Name "OMSWorkspaceKey-$($OMSWorkspaceName)" -SecretValue $OMSWorkspaceKey

$RunAsConn = Get-AzureRmAutomationConnection -Name 'AzureRunAsConnection' -ResourceGroupName $ResourceGroupName -AutomationAccountName $AutomationAccountName
Set-AzureRmKeyVaultAccessPolicy -VaultName $KeyVaultName -ServicePrincipalName $RunAsConn.FieldDefinitionValues.ApplicationId -ResourceGroupName $KeyVaultResourceGroupName -PermissionsToSecrets get

Now, to log information to OMS you need to pass a JSON payload and a LogType.  I have some script that I add to the top of all my scripts to do this.
You will notice I’ve included various data type suffixes to ensure the log data is formatted correctly. I’ve borrowed this from the REST API reference, so please look at the ‘Record type and properties’ section for all the details.
For example, if a value is a number, use _d (double) at the end of the parameter name so no quotes are placed around the value.

#Retrieve the RunAs Connection details
$Conn = Get-AutomationConnection -Name 'AzureRunAsConnection'

#Login with the RunAs Account
Add-AzureRMAccount -ServicePrincipal -Tenant $Conn.TenantID `
    -ApplicationId $Conn.ApplicationID `
    -CertificateThumbprint $Conn.CertificateThumbprint `
    -SubscriptionId $Conn.SubscriptionId

#Retrieve KeyVault Name to get secure variables (Namely, the OMS Workspace Key)
$KeyVaultName = Get-AutomationVariable -Name "KeyVaultName"

#Get OMS Workspace details
$OMSWorkspaceID = $null
$OMSWorkspaceName = Get-AutomationVariable -Name "OMSWorkspaceName"
$OMSWorkspaceID = Get-AutomationVariable -Name "OMSWorkspaceID"
if ($OMSWorkspaceID -ne $null) {
    $OMSKey = (Get-AzureKeyVaultSecret -VaultName $KeyVaultName -Name "OMSWorkspaceKey-$($OMSWorkspaceName)").SecretValueText
    $LogType = "Runbook_AutoshutdownFinal"
} else {
    $OMSKey = $null
    $LogType = $null
}

#Set the LogType to use when posting to OMS
$LogType = "OMSLogTest"

function AddToLog ([Object]$OMSLogData) {

    if ($OMSWorkspaceID -ne $null) {
        $LogData = ''
        foreach ($Key in $OMSLogData.Keys) {
            switch ($Key.Substring($Key.Length-2)) {
                '_s' {$sep = '"';$trim=$Key.Length-2}
                '_t' {$sep = '"';$trim=$Key.Length-2}
                '_b' {$sep = '';$trim=$Key.Length-2}
                '_d' {$sep = '';$trim=$Key.Length-2}
                '_g' {$sep = '"';$trim=$Key.Length-2}
                default {$sep = '"';$trim=$Key.Length}
            }
            $LogData = $LogData + '"' + $Key.Substring(0,$trim) + '":' + $sep + $OMSLogData.Item($Key) + $sep + ','
        }
        $TimeStamp = Get-Date (Get-Date).ToUniversalTime()
        $LogData = $LogData + '"TimeStamp":"' + $timestamp + '"'

        Write-Verbose "LogData: $($LogData)"
        $json = "{$($LogData)}"

        $PostResult = Send-OMSAPIIngestionFile -customerId $OMSWorkspaceID -sharedKey $OMSKey -body $json -logType $LogType -TimeStampField "TimeStamp"
        Write-Verbose "PostResult: $($PostResult)"
        if ($PostResult -ne "Accepted") {
            Write-Error "Error posting to OMS - $PostResult"
        }
    }
}

So whenever you want to log anything you would build a hashtable of the data you want to log and pass that to the AddToLog function.
For example, if you were to log information from a runbook that starts and stops VMs you would use something like this –

$VMName = "MyVM1"
$ResourceGroupName = "MyResourceGroup"
$SubscriptionID = "7d94a8a1-12c4-1234-aaaa-xxxxxxxxxxxx"
$VMState = Get-AzureRmVM -Name $VMName -ResourceGroupName $ResourceGroupName -Status
if ($VMState.Statuses[1].DisplayStatus -eq "VM running") {
    Stop-AzureRmVM -Name $VMName -ResourceGroupName $ResourceGroupName -Force
    $Action = "Stop VM"
} else {
    $Action = "VM Already Stopped"
}

$LogData = @{
    Trigger_s = "Scheduled"
    VMName_s = $VMName
    SubscriptionGUID_g = $SubscriptionId
    ResourceGroup_s = $ResourceGroupName
    TargetState_s = "VM deallocated"
    CurrentState_s = $($VMState.Statuses[1].DisplayStatus)
    Action_s = $Action
}
AddToLog -OMSLogData $LogData

The entry will then show up in the OMS logs under the LogType you specified.

SharePoint 2013 June 2017 CU


Our SharePoint product group released the next monthly cumulative updates. How does patching work for SharePoint 2013? Read more in the post from my colleague Stefan.

New since April 2015:

  • You need to have SP1 installed!
  • In case you have installed an SP1 slipstream version, please read the article from my colleague Stefan!
  • If you run a search-enabled farm, please also check this article.

The way we provide the Cumulative Update packages changed in 2015, which means you can find all packages on our Download Center.

On the download website it does not matter which language you choose; the packages are the same.

Plan your upgrade carefully; you may need more time if the current patch level of your farm is June 2015 or earlier! The reason relates to Search and psconfig. Please check the post from my colleague Stefan!

SharePoint Foundation 2013
3203428 The full server package for SharePoint Foundation 2013
https://support.microsoft.com/en-us/help/3203428

Download link: https://www.microsoft.com/en-us/download/details.aspx?id=55349

SharePoint Server 2013
3203430 The full server package for SharePoint Server 2013. It also contains the SharePoint Foundation 2013 fixes, so you only need this package.
https://support.microsoft.com/en-us/help/3203430

Download link: https://www.microsoft.com/en-us/download/details.aspx?id=55362

Office Web Apps Server 2013
3203391 The full server package for OWAS 2013
https://support.microsoft.com/en-us/help/3203391

Download link: https://www.microsoft.com/en-us/download/details.aspx?id=55360

Project Server 2013
3203429 The full server package for Project Server 2013. It also contains the SharePoint Server and Foundation 2013 fixes, so you only need this package.
https://support.microsoft.com/en-us/help/3203429

Download link: https://www.microsoft.com/en-us/download/details.aspx?id=55336

Important for all Server Applications listed above:
After any binary update you need to run psconfig or psconfigui. Please refer to the following article to find out what is best for you: why-i-prefer-psconfigui-exe-over-psconfig-exe.aspx

You might have your own strategy for running psconfig, because it depends on the farm structure and what makes sense to reduce downtime.

Regarding PSConfig: with the August 2016 CU for SharePoint Server 2016 we updated psconfig with a couple of improvements, discussed in my colleague's blog post:

The news is that these improvements are now also available for SharePoint Server 2013 with this update.

After you have installed the binaries, run psconfig as soon as possible (especially if the patch level of your farm is June 2015 or earlier)! The reasons are here.

Related Info:

Update Center for Microsoft Office, Office Servers, and Related Products

October 2015 Office Update Release by sustain engineering team
Common Question: What is the difference between a PU, a CU and a COD?

How to: install update packages on a SharePoint 2013 farm where search component and high availability search topologies are enabled

CHANGE: SharePoint 2013 Rollup Update for the December 2013 Cumulative Update Packaging

SQL Server 2014 and SharePoint Server 2013

Power BI Desktop June 2017 Update


Microsoft Japan Data Platform Tech Sales Team – Ito

Power BI consists of the Power BI Service and Power BI Desktop, and this month's (June 2017) update of Power BI Desktop adds or enhances the following features.

* This does not apply to Power BI Desktop for Power BI Report Server.

Reports

Data connectivity

Query editing

In terms of visual expressiveness, the highlight is data bars. Tables and matrices keep absorbing more and more of Excel's formatting capabilities. The underlined items are included in the sample report, so please have a look at the real thing. In general you configure these by selecting a visual and opening the Format button (the paint-roller icon), but data bars are conditional formatting applied to a measure, so the key point is to configure them from the right-click menu of the field placed in the Values well.


 

This time I will focus on two features whose names alone don't explain much: high-density sampling and custom data connectors.

High-density sampling

Power BI charts limit the number of data points they display. When the limit is exceeded, an ⓘ mark appears at the top left of the visual with a message along the lines of "Too many values. Not all data is shown." The limit differs by chart type and by where the field is placed. (This appears to be a design decision for rendering performance; detailed documentation about the display limits has not been published.) What has changed is the way data is sampled in order to decide which points not to show.
Based on customer feedback, the sampling algorithm for line charts and area charts has been updated. The new algorithm preserves the shape of the data better and surfaces outliers. It creates bins along the axis and keeps the minimum and maximum values of each line within each bin. This change should significantly improve the experience with line charts that exceed the data point limit, and it lets many data points be represented without sacrificing performance or load time. If you prefer the previous sampling algorithm, you can turn high-density sampling back off under General in the Format pane. (A rough sketch of this min/max binning idea appears after the chart comparison below.)

Line chart with high-density sampling turned on


 

Line chart with high-density sampling turned off


Comparing the two charts, the one with high-density sampling turned on captures the characteristics of the data better.
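
As an illustration of the min/max-per-bin idea, here is a small sketch. It is not Power BI's actual implementation, just a generic downsampler written to show the principle.

using System;
using System.Collections.Generic;
using System.Linq;

static class MinMaxDownsampler
{
    // Split the points of one series into 'binCount' bins along the x-axis and keep
    // the minimum and maximum y-value of each bin, so spikes and outliers survive.
    public static List<(double X, double Y)> Downsample(IReadOnlyList<(double X, double Y)> points, int binCount)
    {
        double minX = points.Min(p => p.X);
        double maxX = points.Max(p => p.X);
        double binWidth = (maxX - minX) / binCount;
        if (binWidth == 0) return points.ToList();

        var result = new List<(double X, double Y)>();
        foreach (var bin in points
            .GroupBy(p => Math.Min(binCount - 1, (int)((p.X - minX) / binWidth)))
            .OrderBy(g => g.Key))
        {
            var min = bin.Aggregate((a, b) => b.Y < a.Y ? b : a);   // bin minimum
            var max = bin.Aggregate((a, b) => b.Y > a.Y ? b : a);   // bin maximum

            // Keep both extremes, emitted in x order.
            if (min.X <= max.X) { result.Add(min); result.Add(max); }
            else { result.Add(max); result.Add(min); }
        }
        return result;
    }
}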

Documentation with more details about the algorithm will probably be published later. Microsoft will continue to invest in this area and plans to improve other visuals as well, so please keep the feedback coming.

 

Custom data connectors

In May 2017 the Data Connectors SDK Developer Preview was announced. This new SDK lets developers create their own data connectors that connect easily to Power BI. Custom data connectors are written in the M language; you can define new functions for M and use them to enable connections to new data sources.
With custom data connectors you can create new data sources, or customize and extend existing ones. Common use cases include:

  • Creating a business-analyst-friendly view over a REST API
  • Providing branding for a source that an existing connector supports (such as an OData service or an ODBC driver)
  • Implementing an OAuth v2 authentication flow for a SaaS offering
  • Exposing a limited or filtered view over a data source to improve usability
  • Supporting different authentication modes when creating a Power BI content pack
  • Enabling DirectQuery over a data source via an ODBC driver

Currently, custom data connectors are supported only in Power BI Desktop. In this month's Power BI Desktop update you can enable this feature and load custom connectors.

 

To use preview features, install the latest version of Power BI Desktop and don't forget to turn the feature on under [File] → [Options and settings] → [Options] → [Preview features]. Incidentally, at the recent Microsoft Data Insight Summit it was announced that, in response to feedback along the lines of "monthly updates to Power BI Desktop are great, but downloading and installing it every time is a pain," Power BI Desktop will also be offered through the Windows Store, where it can update automatically and does not require administrator rights. I can't wait.

SharePoint 2010 and June 2017 CU


Our next cumulative update packages are available. For SharePoint we still suggest installing a full-server package, which looks as follows.

New since November 2014: The prerequisites have changed; you need to have Service Pack 2 installed before upgrading to the next CU. That also means that Service Pack 1 is out of support: https://support.microsoft.com/en-us/lifecycle/search?alpha=sharepoint%202010

The way we provide the Cumulative Update packages changed in calendar year 2015, which means you can find all packages on our Download Center.

On the download website it does not matter which language you choose; the packages are the same.

MSF2010:
No CU for SharePoint Foundation 2010

SPS2010:
3191905 The full server package for SharePoint Server 2010. It also contains the MSF2010 fixes, so you only need this one package.
https://support.microsoft.com/en-us/KB/3191905

Download link: https://www.microsoft.com/en-us/download/details.aspx?id=55338

Project Server 2010:
3191903 The full server package for Project Server 2010. It also contains the SharePoint Server 2010 fixes, so you only need this one package.
https://support.microsoft.com/en-us/KB/3191903

Download link: https://www.microsoft.com/en-us/download/details.aspx?id=55334

Important for all Server Applications listed above:
After applying the preceding updates, run the SharePoint Products and Technologies Configuration Wizard or check the post from my colleague Stefan: Why I prefer PSCONFIGUI.EXE over PSCONFIG.EXE

Links:
Update Center for Microsoft Office, Office Servers, and Related Products
SharePoint patching demystified

Docker Blog Series Part 2 – Build & Deploy ASP.NET Core based Docker Container on Service Fabric


This post is the second in a series of posts on Docker that Premier Developer consultant Monu Bambroo has posted on his blog.


In addition to offering the Service Fabric programming model, Azure Service Fabric can also orchestrate container-based services across a cluster of machines. Service Fabric can deploy services in container images. In this blog post, we will see how to use Service Fabric as an orchestrator for Windows-based Docker images. The images will be published to Docker Hub and consumed by the Service Fabric orchestrator. We will build an ASP.NET Core web application, Dockerize it, and publish it to Docker Hub. Finally, we will publish the image to a Service Fabric cluster.

Read the rest on Monu’s blog here.

Create Bot for Microsoft Graph with DevOps 2: Get your appointment from Microsoft Graph and OAuth 2.0 with AuthBot


As we have already set up VSTS and the unit testing framework, let's implement the actual bot, which can get my Outlook appointments. We can use OAuth 2.0 for Microsoft Graph, and AuthBot is the best module for our bot.

AuthBot

You can see the details of AuthBot here.

1. Add the AuthBot NuGet package to the O365Bot project. You need to check [Include prerelease].


2. Add appSettings in web.config. I will fill in the values later.

<!-- AAD Auth v1 settings -->
<add key="ActiveDirectory.Mode" value="v1" />
<add key="ActiveDirectory.ResourceId" value="https://graph.microsoft.com/" />
<add key="ActiveDirectory.EndpointUrl" value="https://login.microsoftonline.com" />
<add key="ActiveDirectory.Tenant" value="" />
<add key="ActiveDirectory.ClientId" value="" />
<add key="ActiveDirectory.ClientSecret" value="" />
<add key="ActiveDirectory.RedirectUrl" value="http://localhost:3979/api/OAuthCallback" />

3. Add the following code to the Application_Start method of Global.asax.cs.

AuthBot.Models.AuthSettings.Mode = ConfigurationManager.AppSettings["ActiveDirectory.Mode"];
AuthBot.Models.AuthSettings.EndpointUrl = ConfigurationManager.AppSettings["ActiveDirectory.EndpointUrl"];
AuthBot.Models.AuthSettings.Tenant = ConfigurationManager.AppSettings["ActiveDirectory.Tenant"];
AuthBot.Models.AuthSettings.RedirectUrl = ConfigurationManager.AppSettings["ActiveDirectory.RedirectUrl"];
AuthBot.Models.AuthSettings.ClientId = ConfigurationManager.AppSettings["ActiveDirectory.ClientId"];
AuthBot.Models.AuthSettings.ClientSecret = ConfigurationManager.AppSettings["ActiveDirectory.ClientSecret"];

You also need to add using System.Configuration; to resolve the names.

Register an application to Azure AD

To use OAuth 2.0 in Microsoft Graph, you need to register the application in your Azure AD.

1. Log in to https://portal.azure.com with a user who can register applications.

2. Select Azure Active Directory.


3. Click [App registrations].


4. Click [New application registration] and enter a name and sign-on URL. Use http://localhost:3979/api/OAuthCallback for the sign-on URL.


5. Select the created application and click [Required permissions]


6. Add Microsoft Graph


7. Add permissions. As this is a test application, I just checked everything that doesn't require admin privileges.

8. Once the permissions are selected, click [Grant Permissions].


9. Select [Keys] and create a new key. Copy the value somewhere safe.


10. Update the web.config values. The ClientId is the application ID.

Microsoft Graph

1. Add Microsoft.Graph NuGet package to O365Bot project.


2. Add a folder and name it Services.


3. Add GraphService.cs inside the folder and replace its contents with the code below.

using AuthBot;
using Microsoft.Bot.Builder.Dialogs;
using Microsoft.Graph;
using System;
using System.Collections.Generic;
using System.Configuration;
using System.Linq;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

namespace O365Bot.Services
{
    public class GraphService
    {
        IDialogContext context;
        public GraphService(IDialogContext context)
        {
            this.context = context;
        }
       
        /// <summary>
        /// Get events for next 7 days.
        /// </summary>
        /// <returns></returns>
        public async Task<List<Event>> GetEvents()
        {
            var events = new List<Event>();
            var client = await GetClient();

            try
            {
                var calendarView = await client.Me.CalendarView.Request(new List<Option>()
                {
                    new QueryOption("startdatetime", DateTime.Now.ToString("yyyy/MM/ddTHH:mm:ssZ")),
                    new QueryOption("enddatetime", DateTime.Now.AddDays(7).ToString("yyyy/MM/ddTHH:mm:ssZ"))
                }).GetAsync();

                events = calendarView.CurrentPage.ToList();
            }
            catch (Exception ex)
            {
            }

            return events;
        }

        private async Task<GraphServiceClient> GetClient()
        {
            GraphServiceClient client = new GraphServiceClient(new DelegateAuthenticationProvider(AuthProvider));
            return client;
        }

        private async Task AuthProvider(HttpRequestMessage request)
        {
            request.Headers.Authorization = new AuthenticationHeaderValue(
                "bearer", await context.GetAccessToken(ConfigurationManager.AppSettings["ActiveDirectory.ResourceId"]));
        }
    }
}

4. Then update the code in RootDialog.cs with the following to use the service.

using System;
using System.Threading.Tasks;
using Microsoft.Bot.Builder.Dialogs;
using Microsoft.Bot.Connector;
using AuthBot;
using System.Configuration;
using AuthBot.Dialogs;
using System.Threading;
using O365Bot.Services;

namespace O365Bot.Dialogs
{
    [Serializable]
    public class RootDialog : IDialog<object>
    {
        public Task StartAsync(IDialogContext context)
        {
            context.Wait(MessageReceivedAsync);
            return Task.CompletedTask;
        }

        private async Task MessageReceivedAsync(IDialogContext context, IAwaitable<object> result)
        {
            var message = await result as Activity;

            // Check authentication
            if (string.IsNullOrEmpty(await context.GetAccessToken(ConfigurationManager.AppSettings["ActiveDirectory.ResourceId"])))
            {
                // Run authentication dialog
                await context.Forward(new AzureAuthDialog(ConfigurationManager.AppSettings["ActiveDirectory.ResourceId"]), this.ResumeAfterAuth, message, CancellationToken.None);
            }
            else
            {
                // get events
                GraphService service = new GraphService(context);
                var events = await service.GetEvents();
                foreach (var @event in events)
                {
                    await context.PostAsync($"{@event.Start.DateTime}-{@event.End.DateTime}: {@event.Subject}");
                }
            }
        }

        private async Task ResumeAfterAuth(IDialogContext context, IAwaitable<string> result)
        {
            context.Wait(MessageReceivedAsync);
        }
    }
}

Run the app

1. Run the service by pressing F5.

2. Launch the Bot Framework Emulator and connect to http://localhost:3979/api/messages. If you haven't installed it yet, download it from here.

3. Send any message and you will get a sign-in request.


4. Click the button and authenticate.


5. Once the login process completes, you will see a 'magic number' on the page. Send the number back to the bot.


6. Once you have sent the number, send any message again. Yeah, I know you don't want to send the message again, eh? I will fix that in a future blog article.

7. Confirm you got your events for the next 7 days (only if you have anything scheduled, of course).

Summary

This time I just added the basic functionality of getting the schedule, but what about unit testing? I will explain that next time.

GitHub https://github.com/kenakamu/BotWithDevOps-Blog-sample/releases/tag/article_2

Ken

Announcing Microsoft Azure Government services in the Cloud Solution Provider program


We are excited to announce that Azure Government services are now available through the Cloud Solution Provider program (CSP). The CSP program enables partners to make Microsoft Cloud services part of their customer offerings, expanding U.S. government customer options in meeting their mission goals. With the CSP program, partners can now create high-value service offerings that combine use of Azure Government with solution management, customer support, and billing to U.S. government customers. The security, privacy, compliance and transparency of the Azure Government platform give U.S. government partners the right foundation for meeting regulatory requirements while delivering innovation to the U.S. public sector.

The CSP program is a great fit for the U.S. public sector where partners already build, deploy and manage solutions on behalf of federal, state and local government entities. Over the last 60 days, we piloted Azure Government in CSP with a variety of partners and their customers running production workloads. The feedback reinforces the expected demand from U.S. Government customers for secure, compliant cloud-based solutions through a broad ecosystem of partners. Reduced cost, cloud speed, and increased efficiency are all potential benefits of the managed services that U.S. government partners can enable through CSP.

“Smartronix and Microsoft share a common commitment to deliver innovative and valuable services to our government customers. AzureGov on CSP makes it easier for us to deliver these solutions.” – Robert Groat, Executive Vice President, Smartronix

“As compliance with regulation and public demand rapidly increase reliance on technology in the public sector, a consistent, reliable and accessible cloud platform is the backbone supporting that change. With Microsoft Azure, through the SYNNEX CLOUDSolv marketplace, we enable our partners to drive deployment and provide comprehensive solutions that help them compete and grow their business in the government vertical.” – Darren Harbaugh, Vice President, Cloud & Emerging Services, SYNNEX Corporation

“Our partnership with Microsoft enabled us to seize a tremendous opportunity as we were closing the year. Not only did the fast turnaround have us transacting on the Azure Cloud in less than a week, but now we can provide our services on Microsoft’s infrastructure, which opens a wave of new possibilities for new government customers.” – Eric Van Heel, VP of Cloud Solutions Support, Avtex Solutions

“In partnering with Microsoft through CSP and leveraging technologies such as the Azure Pricing Calculator, which is bleeding edge in providing a comprehensive and easy-to-use web-based user interface to select and dynamically price out all of the Azure Cloud Services, Microsoft enabled us to turn around quotes and spin up resources based on our clients’ requirements within minutes, not days.” – Sonoka Ho, VP of Business Operations @ TechTrend.us.

The array of services available in Azure Government is rapidly increasing, and nearly all are available today through CSP, from infrastructure services like virtual machines, storage, and networking to platform services like data, analytics, and web and mobile services. Transacting Azure in the Government Cloud through CSP can be done in three steps:

  1. Learn about CSP: the requirements for participation and the option to own the customer relationship end-to-end.
  2. Decide on the model: direct or indirect.
  3. Get started: sign up, get ready to sell, transact, and support.

Once you decide which model is right for you, follow the enrollment path in CSP for Azure Government.

Become an indirect reseller

As an indirect reseller in the CSP program, you’ll work with an indirect provider (also known as a distributor). Indirect providers can provide your customers with product support, provide you with technical assistance and marketing, and help you establish financing and credit terms.

Minimum requirements include having a Microsoft Partner Network (MPN) ID and the ability to sign legal agreements on behalf of your organization.

If you don’t have the infrastructure to provide customer support and billing, you can connect with an indirect provider. This gives you more time to spend with your customers building specialized service offers. Review the authorized indirect providers in your area, to get help with value added services, support and billing. Here’s where you can learn more about the indirect reseller model and find a provider.

Become a direct partner

As a direct partner, you provide your customers with cloud services, cloud products, and customer support, and you bill your customers directly. If you don’t have the infrastructure for doing this type of business, join as an indirect reseller. Minimum requirements for direct partners include a service business model, a customer support infrastructure, customer billing and invoicing capabilities, and the ability to scale. There is a deeper commitment required to be a direct partner, and details on support requirements, billing and invoicing, managing customers, incentives and licensing can be found here.

Enroll in the CSP Program

After you understand the requirements and commitment, apply now and we’ll review your application. Please note it can take several days to review and verify your information.

Have questions? Please email azgovcsp@microsoft.com and join us on Yammer.


Education Technology – Step out of your comfort zone, by Natalie Burgess


This latest extract from Education Technology is written by Natalie Burgess, Microsoft Teacher Ambassador, who discusses the best way to grow as a teacher: by jumping in and taking the first step. Natalie's current role is to train teachers across Scotland and the North of England to use technology in their classrooms. Natalie was previously a primary school teacher for 6 years at Kirklandneuk Primary School, Renfrew.

Earlier this year, Natalie ran a webinar all about her Digital Leaders Programme and edtech in the classroom, which is available on demand: Digital Leaders in the Primary School.




Education Technology – Step out of your comfort zone, by Natalie Burgess

As a Primary 3 teacher for a number of years, I was excited and ready to change up my teaching and utilise Office 365 successfully in my classroom, within the Scottish GLOW platform. At the beginning, however, some doubts filled my head: Are the pupils too young? Will they be able to use the technology without my support all the time? Do I know enough to use this technology successfully? But with everything new comes challenges and self-doubt, and the best way to grow as a teacher is to step out of your comfort zone.

What I realised was I had to jump in, take the first step and try these great tools with my pupils. Before I knew it, we were collaborating within the classroom, with other classes in the school and even beyond the classroom, all at the click of a button! I used the Microsoft Educator Community to learn some new skills and to get some great ideas. From there I trained up my Digital Leaders on all the tech so they could help me support the other staff members. This was such an important step for me to disseminate my learning and enthusiasm across my school.

After using Office 365 in my classroom consistently, I realised how incredible it was for several reasons. The pupils could collaborate on projects together in the classroom using apps like OneNote, Sway and PowerPoint. One of my first projects involved PowerPoint. I wanted to start with something simple, see how it worked and develop my teaching and learning from there. I created one PowerPoint and using the ‘share’ functionality available on Office 365, I shared this one document with all my pupils. I prepared a slide for each pupil so that they knew where they were to create their piece of work. I shared the success criteria with the pupils and asked them to present their personal research on the Vikings Topic.

With a willingness to grow, learn and make mistakes, anything is possible!

This was amazing because the pupils could work on this at home, in school and anywhere they had access to internet. Some of my pupils even used their parent’s mobile phones to continue their work at home! There was no need now to push homework as the kids were keen to access their work at home and share their successes with their parents. As a teacher, I loved that I could access this easily at home without taking 30 jotters home and could also leave valuable feedback for my pupils that they would see instantly. The children also enjoyed peer assessing each other’s work using the Comments functionality in PowerPoint.

During the National Digital Learning Week things got even more exciting! From 2nd to 6th March, I was involved in a project with schools across Scotland. We all worked together to co-create a story using Word Online on Office 365, again using the Share functionality. This was such a powerful project that taught the pupils how to work together, collaborate, use their thinking skills and have fun! Award winning Scottish authors, Theresa Breslin and Cathy MacPhail kindly agreed to write the opening chapters for each of the stories which the children absolutely loved.

These types of projects would be impossible without the power of Office 365. Having these simple enhancements like ‘Share’, ‘Comments’ and the ability to access work from any device made my lessons so much more engaging and meaningful. Suddenly we could create an audience right away by sharing our work with others via Yammer, Twitter and the Share button. The skills my pupils learned throughout all these projects were incredible and I highly recommend every teacher to have a go, try it and see the engagement happen in your classroom no matter what skill level you feel you are with technology. With a willingness to grow, learn and make mistakes, anything is possible!


Office 365 is available to students and teachers at no cost through their academic institution. Visit Office in Education to check eligibility and to download onto multiple devices.

Earlier this year, Natalie ran a webinar all about her Digital Leaders Programme and edtech in the classroom, which is available on demand: Digital Leaders in the Primary School.

Alternatively, please register to watch our other webinars based on a variety of different content here.

Azure Media Player support for audio-only and video-only streams

$
0
0

Azure Media Player itself does not support audio-only or video-only streams as per the bottom of http://amp.azure.net/libs/amp/latest/docs/features.html.  This is due to issues with the AzureHtml5JS tech.  If you attempt to play a video-only asset in Azure Media Player you will see the error, “The video playback was aborted due to a corruption problem or because the video used features your browser did not support. (0x20400003)”.

At the JavaScript level you may see the error: MEDIA_ERR_SRC_NOT_SUPPORTED

There are a couple of ways to work around this issue depending on what tool you’re using to encode video.  The obvious one is to always make sure that you have assets that have both audio and video.  But sometimes this is impractical or doesn’t meet your needs.  If you are using the Azure web portal http://portal.azure.com/, then use the Adaptive Streaming preset with Media Encoder Standard.

This will do two things.  It will encode the video into a bitrate ladder and it will automatically insert silent audio into video-only sources or black video into audio-only sources.

If you are using the Azure Media Services Explorer (a really great tool, by the way), right-click on the asset and choose Encode asset(s) with Media Encoder Standard (MES)…  On the Preset tab, choose the Adaptive Streaming profile.  That’s all you need to do unless you want to use a specific encoding profile.  To do that, choose the profile you want to use, go to the Advanced tab, and check the Insert a silent audio track when input has no audio option if your source asset has no audio, or Insert black video when input has no video if your source asset has no video.

If you are using the Azure Media Services SDK and are defining your encoding profile with JSON or XML, you can use the parameters described on https://docs.microsoft.com/en-us/azure/media-services/media-services-advanced-encoding-with-mes to insert what you need:

  • InsertSilenceIfNoAudio
  • InsertBlackIfNoVideoBottomLayerOnly
  • InsertBlackIfNoVideo
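
As a rough sketch of where such a flag sits in a custom MES preset, the option goes on the relevant codec entry as a Condition property. The values below are illustrative only; please check the linked documentation for the exact schema your encoder version expects.

{
  "Version": 1.0,
  "Codecs": [
    {
      "Type": "AACAudio",
      "Condition": "InsertSilenceIfNoAudio",
      "Profile": "AACLC",
      "Channels": 2,
      "SamplingRate": 48000,
      "Bitrate": 128
    }
  ]
}

The video-side flags (InsertBlackIfNoVideo and InsertBlackIfNoVideoBottomLayerOnly) are applied in the same way on the video codec entry.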

Math Braille UI


For sighted users, Microsoft Office applications like Word, PowerPoint and OneNote have user interface (UI) cues that reveal math zones, selected text, the insertion point (IP) if no text is selected, and the argument of the innermost math object (fraction, subscript, integral, matrix, …) that contains the IP. Math speech also reveals these properties. These sighted and speech math UI’s enable unambiguous reading and editing of mathematics. Naturally, it’s desirable to reveal these properties in refreshable braille displays as well. This post describes a promising methodology. We represent math using the Nemeth math code since it is the most efficient for the math zones used in modern applications such as Microsoft Word and LaTeX.

First here’s how these features are revealed to sighted users. In all math-enabled Office apps, the innermost math argument containing the IP is lightly shaded and selected text has the same selection background color as text not in math zones. In PowerPoint and OneNote, the math object containing the IP is shaded a bit more lightly than the argument and if the IP isn’t in a math object, the whole math zone has this lighter shading. In Word, the math zone is enclosed in a boundary and the object containing the IP doesn’t have the lighter shading. The user always knows what kind of an argument is involved just by looking at the built-up (Professional) display. This information is also conveyed in math fine-grained speech.

Refreshable braille display

A refreshable braille display typically has a row of 40 or 80 8-dot cells with the dots represented by small rounded pins that are raised by solenoids. The dots are arranged in two columns of four dots. The left column is numbered starting at the top 1, 2, 3, 7 and the right column is numbered starting from the top 4, 5, 6, 8. Like most braille codes, the Nemeth math code uses the dots 1 through 6. This leaves dots 7 and 8 for UI purposes, although dot 7 is occasionally used to indicate upper case. The Nemeth code precedes a letter with the capitalization indicator “⠠” (lone dot 6) to get upper-case letters, e.g., “⠠⠁” for “A” since “⠁” is the braille code for the letter a. So, we don’t use dot 7 to indicate upper case, at least in math zones.

The regular math braille display shows the whole math zone in braille, limited only by the number of display cells. This gives a lot of context to math braille, significantly more than math speech provides, but not as much as screen or paper.

Selection and insertion point (IP)

Typically, selected text appears with both dots 7 and 8. So if “a” is selected, it appears as “⣁”. This approach seems well suited to math expressions as well.
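
Unicode braille patterns make this dot arithmetic easy to see: in the block U+2800–U+28FF, dot n corresponds to bit n−1 of the offset from U+2800, so marking a cell as selected just ORs in the bits for dots 7 and 8. A small sketch (illustrative only, not the editor's actual code):

using System;

class BrailleDots
{
    // Dots 7 and 8 are bits 6 and 7 of the offset from U+2800, i.e. the mask 0xC0.
    const int Dots78 = 0xC0;

    static char AddDots78(char cell) => (char)(cell | Dots78);

    static void Main()
    {
        char a = '\u2801';               // "⠁", braille letter a (dot 1)
        Console.WriteLine(AddDots78(a)); // "⣁" (dots 1, 7, 8) – a selected "a"
        Console.WriteLine('\u28C0');     // "⣀" (dots 7, 8 only) – the insertion point
    }
}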

We’re left with needing ways to identify a math zone and the insertion point and to highlight the innermost argument containing the IP if any. Braille displays don’t have multiple shading levels, only two extra dots! They also have hot keys.

The IP needs a cell by itself to stand out. As described in the post Text Insertion Point, the IP is in between two characters in rich text, although for plain text one can get away with thinking of the IP as being on top of the character that actually follows the IP. Built-up (Professional) math text is rich text notably because it has special display constructs, such as stacked fractions, multilevel subscripts and superscripts, integrals, matrices, etc. For this purpose (and perhaps others), dots 7-8 “⣀” comprise a simple, effective IP. Admittedly this is the same as a lone selected space, but it seems to be readily distinguishable since the user usually knows when something is selected versus having an IP and s/he can easily move the IP (or hit the IP-identification hot key coming up) to check if in doubt.

Math-object arguments

To reveal the innermost argument containing the IP, one can turn on dot 8 for the characters in that argument. This is similar to the argument shading used in regular displays. To illustrate this approach, consider the fraction 1/2π, which in built-up form is given by the Nemeth braille string “⠹⠂⠌⠆⠨⠏⠼”. If the IP precedes the 2 in the denominator, the braille display would have “⠹⠂⠌⣀⢆⢨⢏⠼”.

At first the dot 8 in the denominator cells here might be confusing, but it resolves ambiguities as to whether the IP “⣀” is inside or outside of a math object. This isn’t a serious problem with fractions since the fraction start, fraction bar, and fraction end appear as the explicit braille codes ⠹,⠌,⠼, respectively, although it’s always helpful to know when the IP is in a math argument. But consider the quantity a², which is given in Nemeth braille by “⠁⠃⠆”. In Office apps and MathML, superscripts are represented by two arguments, the base and the superscript. If the IP precedes the base, is the IP at the start of the base or at the start of the superscript object? That position is ambiguous without the dot 8 option. With dot 8, you can tell the difference: in “⣀⠁⠃⠆” the IP precedes the superscript object, while in “⣀⢁⠃⠆” the IP is inside the base in front of the “a”. Distinguishing these positions is essential for unambiguous editing of mathematical text.

IP-identification hot key

Dot-8 highlighting reveals when the IP is at the start or end of an argument or somewhere in between. But it doesn’t define what kind of argument. To get this kind of information on a braille display, it’s handy to have an IP-identification hot key that flashes the name of the argument containing the IP (or “math zone” if the IP isn’t inside an argument) onto the braille display. This name needs to be localized to the current user language, while the regular braille for the math zone is globalized by nature. For example in English, depending on where the IP is in a denominator, the hot key displays “start of denominator” (⠎⠞⠁⠗⠞⠀⠕⠋⠀⠙⠑⠝⠕⠍⠊⠝⠁⠞⠕⠗), “end of denominator” or just “denominator”. This is more informative than the corresponding math speech, which only announces the kind of argument when the IP is at the end of an argument, or the kind of math object when the IP is at the start of an object. This difference occurs because fine-grained speech needs to say the character at the IP, whereas the math braille display continuously shows the characters around the IP, limited only by the number of display cells.

It might be worth having options to enable/disable dot-8 highlighting according to user preference. Even without the dot-8 highlighting, the user can resolve ambiguities by hitting the IP-identification hot key so some users might prefer to work with the simpler braille display.

Math zone identification

Lastly, how do you reveal a math zone? If the IP is inside a math-object argument, the presence of dot 8 is a good indicator. As described in the post Braille for Math Zones, math zones start with “⠸⠩” and end with “⠸⠱”. So, the start and end of a math zone are not ambiguous in math braille. In the Microsoft Office math representation, whether the IP at the start of a math zone is inside the math zone or outside is revealed by shading or the Word math-zone border, since the character position is the same for both cases. Ditto for the end of a math zone. I tried setting dot 8 for all cells in a math zone when the IP is in a math zone, but not inside an argument, but it seems too messy. So hopefully the math zone start and end delimiters will suffice; the user can hit the IP-identification hot key to find out whether the IP is in a math zone.

With these uses of dots 7 and 8 and the IP-identification hot key, you can edit virtually all levels of mathematics using a refreshable braille display in an interoperable way with sighted users. Pretty cool, eh?!

Creating NSIS and Inno Setup installers directly in Visual Studio 2017


With the arrival of Visual Studio 2017, a new version of the Visual & Installer tool for building installers directly from Visual Studio has come to market. Visual & Installer is an extension for Visual Studio 2005 through 2017 that adds NSIS and Inno Setup projects to the IDE and lets you work with them as first-class projects, giving you all the advantages of Visual Studio together with the capabilities of these installation systems.

Key features:
– New project types: NSIS Project and Inno Setup Project
– Full integration into the Visual Studio IDE
– Building installers directly alongside other project types (C#, C++, .NET, VB…)
– Project Properties with support for variables and MSBuild
– Syntax highlighting
– IntelliSense – word completion (Complete Word, Ctrl + Space)
– IntelliSense – Parameter Info
– IntelliSense – Quick Info (tooltips over variables, functions, …)
– Script navigation (Navigation Bars)
– Full integration with the solution build (solution build, Configuration Manager)
– Quick jumps from the Error List (double-click an error)
– Go To, Find Definition, Declaration, … functions
– Hints and tips for writing scripts
– Code optimization
– Opening files (Go To file)
– IntelliSense – Code Snippets (including your own)
– Regions (code folding – Outlining)
– Recognition of plug-ins and header files
– Full support for version control systems (SVN, Git, …)
– Finding and renaming references (References Window; variables, functions, methods, …)
– Dynamic help
– Compatibility across all Visual Studio versions
– A designer for MessageBoxes
– Converting VS Setup projects (.vdproj) to NSIS or Inno Setup!

You can find more information, including a fully functional trial version, on the product page www.visual-installer.com.
If you have any questions, don't hesitate to contact the makers of Visual & Installer, the unSigned team.

 

Miroslav Kubovčík

A wrap up from EduTECH 2017 – Sydney


Empowering the students of today to create the world of tomorrow.


Thank you for attending EduTECH 2017!

At Microsoft it is our mission to empower the students of today to create the world of tomorrow.

We aim to enable the right environment for better learning with an affordable and accessible learning platform, using modern tools for classroom collaboration and experiences that spark creativity.

It was a pleasure to meet you all at EduTECH, we loved hearing about how you are using our technology in your classrooms. We hope that you enjoyed the demos at our booth, workshops, and getting hands-on with our newest range of devices. Hopefully you had a chance to speak to our team of education experts, and of course, enjoy a coffee on us!

This year, EduTECH was held at the brand new ICC located in Darling Harbour, Sydney. Although the rain pelted down fairly steadily over the 2 days, spirits were not dampened and there was a buzz on the expo floor.

We held fun and interactive workshops on our stand, which were live-streamed via our Twitter and Facebook channels, giving those the chance not physically there to enjoy the content. Members of our team, including Minecraft Lead Dan Bowen, and Teacher Ambassadors Megan Townes and Troy Waller presented on their areas of expertise, and we also had guest presenters Solange Cunin from Cuberider, Rangan Srikhanta from One Education, Mark Stanley from Literatu, Rob Swaab from Vivi, and John Sietsma from Unity3D.

We were lucky to have Lee Sabow, Windows Education Product Marketing Lead out from Microsoft Corporation in Redmond, sharing the latest on Windows 10, the Creators Update, mixed reality for schools, our new device range for all students with a variety of learning styles, and the recently announced Windows 10 S.

Meanwhile, in the IT congress breakout sessions Ray Fleming, Education Industry Lead at Microsoft presented on “Disruptive Technologies – What’s Now and Next for Education”. He discussed how the advent of powerful cloud computing and changing environments will impact on school based IT, teaching, learning and even buildings and learning spaces, as well as what’s likely to transform the education sector in the coming years including prescriptive analytics; mixed reality technologies like Hololens; IoT (internet of things); Azure cognitive services and much more.

EduTECH Highlights

1. Enabling the right environment for better learning outcomes

Microsoft is working to make a better learning environment for every student, and that includes an environment that is supportive, inclusive and relevant.  We want students to learn in the way that suits them best, on tools that they will use in their future lives and careers.

There’s a Windows 10 device to suit everyone, no matter the age, learning style or budget. Find out more with our guide to finding the right device for your school.

2. Providing an affordable and accessible learning platform

Microsoft provides a platform that is simple to set up and manage, affordable and flexible enough to meet the needs of an individual, a classroom, and entire school or school district.

Windows 10 devices support a powerful and flexible platform to easily manage and deploy apps, now streamlined for security and performance with the new Windows 10 S. There is a Windows device to suit everyone, no matter the age, learning style or budget. Find out more with our Device Finder Guide.

Our partners also have a range of affordable and flexible education devices that are simple to set up and manage with the power of Intune for Education.  One such partner is OneEducation, and founder Rangan Srikhanta demoed the Infinity:One device during EduTECH.

3. Using modern tools for classroom collaboration

With simple and accessible apps like OneNote and Microsoft Teams in Office 365 for Education, classrooms have access to a world of options to create and collaborate, building the skills for the jobs of tomorrow.

Megan Townes showcased these tools at EduTECH, demonstrating how the power of Office 365 Teams, Windows 10 Creators Update and Minecraft Education Edition can engage students by bringing learning to life in and out of the classroom.

4. Experiences that spark creativity

Inspire and prepare the creators of tomorrow through fun and collaborative apps that are fast becoming valuable tools in 21st century classrooms, such as Minecraft: Education Edition and Paint 3D. Solange Cunin, the CEO of Cuberider, spoke to teachers about creating experiences that unlock students' ability to create, experiment and express ideas in new ways within the classroom.


