
Build Your Teams Calling Bot (IVR application) with Microsoft Graph

In an earlier post, I explained the developer platform for building interactive voice applications with Skype for Business. Now, as part of Microsoft's new vision for intelligent communications, Microsoft Teams is replacing Skype for Business.
In this post, I introduce the new Teams calling API in Microsoft Graph to get you started.

With the new Microsoft Graph calling API, you can build advanced calling applications on the new Teams infrastructure, such as IVR applications with DTMF (dual-tone multi-frequency) input, media playback, call transfer, and so on.
Beyond an ordinary bot, you can also build applications for the following advanced scenarios:

  • You can assign a phone number to your bot and make it reachable by phone. (See here.)
  • Your application (not a human) can join an online meeting in Teams and interact with audio.

As I note later in this post, you can easily build calling applications with the SDK library, but here I walk through the raw REST calling flows so you can understand how this API works under the hood.

Service-hosted and Application-hosted

There are two types of Teams calling applications: service-hosted (remote media) applications and application-hosted (local media) applications.

A service-hosted application can be implemented as a simple HTTP-based application. An application-hosted application, on the other hand, is TCP-based and can perform more advanced media operations, such as real-time streaming and screen sharing, using low-level media programming on the Windows platform. (It's therefore better to use the wrapped API when building an application with application-hosted media.)

In this post, I show an example using service-hosted media.

Preparation (Registration)

A calling bot involves two legs of Azure Active Directory (Azure AD) endpoints: one is the bot's webhook endpoint, which receives user messages and state notifications as incoming requests, and the other is the Microsoft Graph endpoint, against which the bot requests calling operations such as playing prompts, recording, and transferring. (See the following picture.)

Therefore you must register your Azure AD application for each of these two endpoints. (Of course, you can also consolidate these two endpoints into a single registration.)
The following is a brief outline of the preparation tasks for your calling application; see "Microsoft Teams : Registering a calling bot for Microsoft Teams" for details.

  1. Register a new bot (Bot Channels Registration) in the Azure Portal and set the calling configuration in the Teams channel. (By registering your bot in Bot Service, a corresponding Azure AD application is also registered in Azure AD.)
  2. Register a new Azure AD application for calling the Microsoft Graph API. In the app registration settings, add the "Calls.Initiate.All" and "Calls.AccessMedia.All" (needed for recording and tone subscription) permissions under the "Microsoft Graph" application permissions, and grant admin consent for these permissions. (Use the new integrated registration UI (currently in preview) in the Azure Portal.)
  3. Enable the developer preview in the Teams client you use for debugging.
  4. Generate a manifest with the following bot settings (enabling "supportsCalling"), create a zip package, and upload the package into your Teams client.
{
  ...
  "bots": [
    {
      "botId": "test01",
      ...
      "supportsCalling": true,
      "supportsVideo": true,
      ...
    }
  ],
  ...
}

After you've installed the package, you can see the calling (voice) experience in your bot.
You can now start calling this bot!

Initiate Call Communications

Now let's look at how your bot interacts with the new calling framework.

1. Start Communications

First, your bot receives a request from the user as the following webhook.
"2d1a0600-d2bc-40bf-ab9a-ff8119f1282f" below is the generated call (communication) id, which is unique to each communication, and your bot should send operation requests (playing, recording, transferring, etc.) to the "/app/calls/{communication id}" endpoint in Microsoft Graph. (See the "resource" attribute in the following HTTP request body.)

As I explained in my earlier post "BUILD BOT with Bot Builder Rest Api (Azure Bot Service)", messages from the Bot Service platform are protected by the "Authorization" header value, and your bot must verify this value for secure communication. (If you ignore this header, your code could be invoked by a malicious program.) See "BUILD BOT with Bot Builder Rest Api (Azure Bot Service)" for the validation steps.
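
In short, the validation is standard JWT verification: fetch the Bot Framework OpenID metadata, resolve the signing keys from "jwks_uri", and check the token's signature, issuer, and audience (your bot's app id). The following is only a sketch of the metadata lookup; the response is abbreviated:

GET https://login.botframework.com/v1/.well-known/openidconfiguration

HTTP/1.1 200 OK
Content-Type: application/json

{
  "issuer": "https://api.botframework.com",
  "jwks_uri": "https://login.botframework.com/v1/.well-known/keys",
  ...
}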

POST https://example.com/yourbot
Accept: application/json
Authorization: Bearer eyJhbGciOi...
Content-Type: application/json; charset=utf-8

{
  "@odata.type": "#microsoft.graph.notification",
  "changeType": "created",
  "resource": "/app/calls/2d1a0600-d2bc-40bf-ab9a-ff8119f1282f",
  "resourceData": {
    "@odata.type": "#microsoft.graph.call",
    "state": "incoming",
    "direction": "incoming",
    "callbackUri": "https://example.com/yourbot",
    "source": {
      "@odata.type": "#microsoft.graph.participantInfo",
      "identity": {
        "@odata.type": "#microsoft.graph.identitySet",
        "encrypted": {
          "@odata.type": "#microsoft.graph.identity",
          "id": "11vxrihb5t...",
          "tenantId": "65652f5f-79bf-47a7-...",
          "identityProvider": "None"
        }
      },
      "region": "amer",
      "languageId": ""
    },
    "targets": [
      {
        "@odata.type": "#microsoft.graph.participantInfo",
        "identity": {
          "@odata.type": "#microsoft.graph.identitySet",
          "application": {
            "@odata.type": "#microsoft.graph.identity",
            "id": "4efee072-68f6-4876-8870-691dfaa0f1cc",
            "tenantId": null,
            "identityProvider": "AAD"
          }
        }
      }
    ],
    "tenantId": "65652f5f-79bf-47a7-...",
    "myParticipantId": "6e6ea994-3f6a-4849-a64b-584c8fe5ba7a",
    "id": "2d1a0600-d2bc-40bf-ab9a-ff8119f1282f"
  }
}
HTTP/1.1 202 Accepted

Instead of the passive flow above, your bot can also start (initiate) a communication against the Microsoft Graph endpoint as follows. ("37706640-fcc4-4d72-bcf8-3558130ccabf" below is the object id of the target user.)
Note that the value of this "Authorization" header is not the same as the previous one. (See the next section, "Answer and Establish", for how to obtain this value.)

POST https://graph.microsoft.com/beta/app/calls
Authorization: Bearer eyJ0eXAiOi...
Content-Type: application/json

{
  "@odata.type": "#microsoft.graph.call",
  "direction": "outgoing",
  "callbackUri": "https://example.com/yourbot",
  "source": {
    "@odata.type": "#microsoft.graph.participantInfo",
    "identity": {
      "@odata.type": "#microsoft.graph.identitySet",
      "application": {
        "@odata.type": "#microsoft.graph.identity",
        "id": "4efee072-68f6-4876-8870-691dfaa0f1cc",
        "displayName": "Test Bot"
      }
    }
  },
  "targets": [
    {
      "@odata.type": "#microsoft.graph.participantInfo",
      "identity": {
        "@odata.type": "#microsoft.graph.identitySet",
        "user": {
          "@odata.type": "#microsoft.graph.identity",
          "id": "37706640-fcc4-4d72-bcf8-3558130ccabf"
        }
      }
    }
  ],
  "requestedModalities": [
    "audio"
  ],
  "tenantId": "65652f5f-79bf-47a7-..."
}
HTTP/1.1 201 Created
Content-Type: application/json;odata.metadata=minimal;odata.streaming=true;IEEE754Compatible=false;charset=utf-8
Location: https://graph.microsoft.com/beta/app/calls/2d1a0600-d2bc-40bf-ab9a-ff8119f1282f

{
  "@odata.context": "https://graph.microsoft.com/beta/$metadata#app/calls/$entity",
  "state": "Establishing",
  "terminationReason": null,
  "direction": "Outgoing",
  "ringingTimeoutInSeconds": null,
  "subject": null,
  "callbackUri": "https://example.com/yourbot",
  "requestedModalities": [
    "Audio"
  ],
  "activeModalities": [],
  "routingPolicies": [],
  "tenantId": "65652f5f-79bf-47a7-...",
  "myParticipantId": "01031098-cc53-4aaf-a894-0fc89c4e0694",
  "id": "2d1a0600-d2bc-40bf-ab9a-ff8119f1282f",
  "error": null,
  "resultInfo": null,
  "answeredBy": null,
  "chatInfo": null,
  "meetingInfo": null,
  "meetingCapability": null,
  "toneInfo": null,
  "callRoutes": [],
  "source": {
    "region": null,
    "languageId": null,
    "identity": {
      "user": null,
      "device": null,
      "phone": null,
      "application": {
        "id": "4efee072-68f6-4876-8870-691dfaa0f1cc",
        "displayName": "Test Bot"
      }
    }
  },
  "targets": [
    {
      "region": null,
      "languageId": null,
      "identity": {
        "application": null,
        "device": null,
        "phone": null,
        "user": {
          "id": "37706640-fcc4-4d72-bcf8-3558130ccabf",
          "displayName": null
        }
      }
    }
  ]
}

2. Answer and Establish

To invoke operations (playing, recording, transferring, etc.) against an established communication, your bot must call the Microsoft Graph endpoint. Even when your bot accepts or rejects an incoming request, it must call this endpoint.

Before invoking Microsoft Graph, your bot must first obtain an OAuth access token from Azure Active Directory (Azure AD). In this case, your bot must use the client credentials flow with application permissions (which grant organization-level access and are therefore powerful) without an interactive login UI.
I don't cover the details of this authentication flow in this post; see my earlier post "Backend (Daemon) App calling API protected by Azure AD" for the full flow.
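
As a minimal sketch, the token request is a client credentials grant against the Azure AD token endpoint. The tenant id, client id, and client secret below are placeholders for your own values, and the "access_token" in the JSON response is used as the "Authorization: Bearer ..." value in the Graph requests that follow.

POST https://login.microsoftonline.com/{tenant-id}/oauth2/v2.0/token
Content-Type: application/x-www-form-urlencoded

client_id={application-id}
&scope=https%3A%2F%2Fgraph.microsoft.com%2F.default
&client_secret={application-secret}
&grant_type=client_credentials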

The following request accepts (answers) the previous incoming communication using Microsoft Graph. (The "Authorization" header value must be the token with application permissions mentioned above.)
Subsequent callbacks (webhooks) are sent to the "callbackUri" (see below). For instance, you can add a query string to this URI (e.g. a hypothetical "https://example.com/yourbot?callid=...") to identify the call instance in the callbacks.

Note : When your bot wants to reject an incoming request, use "reject" in the URI instead of "answer". (A sketch follows the answer example below.)

POST https://graph.microsoft.com/beta/app/calls/2d1a0600-d2bc-40bf-ab9a-ff8119f1282f/answer
Authorization: Bearer eyJ0eXAiOi...
Content-Type: application/json

{
  "callbackUri": "https://example.com/yourbot",
  "acceptedModalities": [
    "audio"
  ]
}
HTTP/1.1 202 Accepted
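
For reference, rejecting the incoming call is a similar request. This is only a sketch; the "reason" property and its "busy" value are taken from the beta reject API and may change.

POST https://graph.microsoft.com/beta/app/calls/2d1a0600-d2bc-40bf-ab9a-ff8119f1282f/reject
Authorization: Bearer eyJ0eXAiOi...
Content-Type: application/json

{
  "reason": "busy"
}
HTTP/1.1 202 Accepted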

Once you've answered (accepted) the incoming request, the state transitions ("establishing" -> "established" -> "terminated" on hang-up) are reported as webhooks to the previous "callbackUri" as follows.
Your bot must wait until the communication is established.

POST https://example.com/yourbot
Authorization: Bearer eyJhbGciOi...
Content-Type: application/json; charset=utf-8

{
  "@odata.type": "#microsoft.graph.notification",
  "changeType": "updated",
  "resource": "/app/calls/2d1a0600-d2bc-40bf-ab9a-ff8119f1282f",
  "resourceData": {
    "@odata.type": "#microsoft.graph.call",
    "state": "establishing"
  }
}
HTTP/1.1 202 Accepted
POST https://example.com/yourbot
Authorization: Bearer eyJhbGciOi...
Content-Type: application/json; charset=utf-8

{
  "@odata.type": "#microsoft.graph.notification",
  "changeType": "updated",
  "resource": "/app/calls/2d1a0600-d2bc-40bf-ab9a-ff8119f1282f",
  "resourceData": {
    "@odata.type": "#microsoft.graph.call",
    "state": "established",
    "replacesContext": "aHR0cHM6Ly..."
  }
}
HTTP/1.1 202 Accepted

State changes are also notified for other activities (playing prompts, recording, etc.), and your bot must handle these notifications throughout the call.

Calling Scenarios

Once the communication is established, your bot can request various operations against Microsoft Graph.

Play Prompting

The first example prompts the user with a voice message. (See the following HTTP request and response.) The returned "8ddca5c5-c97e-4305-88b5-98bf9a0ec223" is the operation id of this prompt operation.
Here we play a static voice message from a .wav file, but of course you could dynamically generate the audio binary with the Cognitive Services Speech service, etc.
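
For example, a rough sketch of generating such audio on the fly with the Speech text-to-speech REST API might look like the following. The region, subscription key, user agent, and voice name are placeholders (assumptions), and the output format is chosen to produce a plain WAV file:

POST https://{region}.tts.speech.microsoft.com/cognitiveservices/v1
Ocp-Apim-Subscription-Key: {speech-service-key}
Content-Type: application/ssml+xml
X-Microsoft-OutputFormat: riff-16khz-16bit-mono-pcm
User-Agent: YourCallingBot

<speak version='1.0' xml:lang='en-US'>
  <voice xml:lang='en-US' name='en-US-JessaRUS'>
    Hello, thank you for calling.
  </voice>
</speak>

Your bot would then host the returned audio at a URI reachable by the calling platform (like "https://myfilelocation.com/audio/greeting.wav" below).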

POST https://graph.microsoft.com/beta/app/calls/2d1a0600-d2bc-40bf-ab9a-ff8119f1282f/playPrompt
Authorization: Bearer eyJ0eXAiOi...
Content-Type: application/json

{
  "prompts": [
    {
      "@odata.type": "#microsoft.graph.mediaPrompt",
      "mediaInfo": {
        "@odata.type": "#microsoft.graph.mediaInfo",
        "uri": "https://myfilelocation.com/audio/greeting.wav"
      },
      "loop": 1
    }
  ],
  "clientContext": "b8b347c1-45fc-401f-91bc-e43f469636c9"
}
HTTP/1.1 200 OK
Content-Type: application/json;odata.metadata=minimal;odata.streaming=true;IEEE754Compatible=false;charset=utf-8
Location: https://graph.microsoft.com/beta/app/calls/2d1a0600-d2bc-40bf-ab9a-ff8119f1282f/operations/8ddca5c5-c97e-4305-88b5-98bf9a0ec223

{
  "@odata.context": "https://graph.microsoft.com/beta/$metadata#CommsOperation",
  "@odata.type": "#microsoft.graph.CommsOperation",
  "clientContext": "b8b347c1-45fc-401f-91bc-e43f469636c9",
  "status": "Running",
  "createdDateTime": "2018-12-19T09:24:16.9812203Z",
  "lastActionDateTime": "2018-12-19T09:24:16.9812203Z",
  "id": "8ddca5c5-c97e-4305-88b5-98bf9a0ec223",
  "errorInfo": null
}

Your bot is notified as follows when the prompt has finished playing. (The prompt completes and the operation is deleted.)

POST https://example.com/yourbot
Accept: application/json
Authorization: Bearer eyJhbGciOi...
Content-Type: application/json; charset=utf-8

{
  "@odata.type": "#microsoft.graph.notification",
  "changeType": "deleted",
  "resource": "/app/calls/2d1a0600-d2bc-40bf-ab9a-ff8119f1282f/operations/8ddca5c5-c97e-4305-88b5-98bf9a0ec223",
  "resourceData": {
    "@odata.type": "#microsoft.graph.playPromptOperation",
    "prompts": [
      {
        "@odata.type": "#microsoft.graph.mediaPrompt",
        "mediaInfo": {
          "@odata.type": "#microsoft.graph.mediaInfo",
          "uri": "https://myfilelocation.com/audio/greeting.wav",
          "resourceId": "f829b9e3-709d-472e-8a11-b61d783de223"
        },
        "loop": 1
      }
    ],
    "clientContext": "b8b347c1-45fc-401f-91bc-e43f469636c9",
    "status": "completed",
    "createdDateTime": "2018-12-19T09:24:16.9812203+00:00",
    "lastActionDateTime": "2018-12-19T09:24:26.1613009+00:00",
    "id": "8ddca5c5-c97e-4305-88b5-98bf9a0ec223"
  }
}
HTTP/1.1 202 Accepted

Record

The next example records and retrieves the user's voice response.
With the following HTTP request, an audio message (question01.wav) is played and a beep is sounded before recording starts. After the user speaks and presses the "#" tone, the recording finishes and is saved as audio data.

POST https://graph.microsoft.com/beta/app/calls/2d1a0600-d2bc-40bf-ab9a-ff8119f1282f/record
Authorization: Bearer eyJ0eXAiOi...
Content-Type: application/json

{
  "prompts": [
    {
      "@odata.type": "#microsoft.graph.mediaPrompt",
      "mediaInfo": {
        "@odata.type": "#microsoft.graph.mediaInfo",
        "uri": "https://myfilelocation.com/audio/question01.wav"
      },
      "loop": 1
    }
  ],
  "playBeep": true,
  "streamWhileRecording": false,
  "stopTones": [
    "#"
  ],
  "clientContext": "b8b347c1-45fc-401f-91bc-e43f469636c9"
}
HTTP/1.1 200 OK
Content-Type: text/plain
Location: https://graph.microsoft.com/beta/app/calls/2d1a0600-d2bc-40bf-ab9a-ff8119f1282f/operations/aa7939be-d586-41c2-b4c9-500ff495c5b2

After the recording has finished (i.e., the user has responded), the following webhook arrives at your bot.
Your bot can retrieve the recorded voice from "recordResourceLocation" using the "recordResourceAccessToken" token. For instance, your bot might convert the audio binary to text with the Cognitive Services Speech service and extract the meaning with Language Understanding.

POST https://example.com/yourbot
Accept: application/json
Authorization: Bearer eyJhbGciOi...
Content-Type: application/json; charset=utf-8

{
  "@odata.type": "#microsoft.graph.notification",
  "changeType": "deleted",
  "resource": "/app/calls/2d1a0600-d2bc-40bf-ab9a-ff8119f1282f/operations/aa7939be-d586-41c2-b4c9-500ff495c5b2",
  "resourceData": {
    "@odata.type": "#microsoft.graph.recordOperation",
    "prompts": [
      {
        "@odata.type": "#microsoft.graph.mediaPrompt",
        "mediaInfo": {
          "@odata.type": "#microsoft.graph.mediaInfo",
          "uri": "https://myfilelocation.com/audio/question01.wav",
          "resourceId": "8e9193ea-e7ff-4900-b133-42481ea0b8b0"
        },
        "loop": 1
      }
    ],
    "initialSilenceTimeoutInSeconds": 5,
    "maxSilenceTimeoutInSeconds": 5,
    "maxRecordDurationInSeconds": 300,
    "playBeep": true,
    "streamWhileRecording": false,
    "stopTones": [
      "#"
    ],
    "recordResourceLocation": "https://somefilelocation/dc63b717-7829-42aa-80aa-a5b912ac3b0c",
    "recordResourceAccessToken": "eyJhbGciOi...",
    "completionReason": "stopToneDetected",
    "clientContext": "b8b347c1-45fc-401f-91bc-e43f469636c9",
    "status": "completed",
    "createdDateTime": "2018-12-19T07:47:50.6059503+00:00",
    "lastActionDateTime": "2018-12-19T07:48:07.9947343+00:00",
    "id": "aa7939be-d586-41c2-b4c9-500ff495c5b2"
  }
}
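
Retrieving the recording itself is, in essence, just an authenticated GET against "recordResourceLocation", with "recordResourceAccessToken" as the bearer token (the URL below is the placeholder value from the notification above). The response body is the recorded audio binary, which you can then pass to speech-to-text.

GET https://somefilelocation/dc63b717-7829-42aa-80aa-a5b912ac3b0c
Authorization: Bearer eyJhbGciOi...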

Subscribe to Tones

Your bot can also subscribe to tones (keypad input) from the user as follows.

POST https://graph.microsoft.com/beta/app/calls/2d1a0600-d2bc-40bf-ab9a-ff8119f1282f/subscribeToTone
Authorization: Bearer eyJ0eXAiOi...
Content-Type: application/json

{
  "clientContext": "b8b347c1-45fc-401f-91bc-e43f469636c9"
}
HTTP/1.1 200 OK
Content-Type: application/json;odata.metadata=minimal;odata.streaming=true;IEEE754Compatible=false;charset=utf-8
Location: https://graph.microsoft.com/beta/app/calls/2d1a0600-d2bc-40bf-ab9a-ff8119f1282f/operations/a37a2e0c-22f4-48ec-a030-722532f00e1c

{
  "@odata.context": "https://graph.microsoft.com/beta/$metadata#CommsOperation",
  "@odata.type": "#microsoft.graph.SubscribeToToneOperation",
  "clientContext": "b8b347c1-45fc-401f-91bc-e43f469636c9",
  "status": "NotStarted",
  "createdDateTime": "2018-12-19T05:19:54.6082698Z",
  "lastActionDateTime": "2018-12-19T05:19:54.6082698Z",
  "id": "a37a2e0c-22f4-48ec-a030-722532f00e1c",
  "errorInfo": null
}

When the user presses a key, the following webhook arrives at your bot. In this case, the user pressed "0".

POST https://example.com/yourbot
Accept: application/json
Authorization: Bearer eyJhbGciOi...
Content-Type: application/json; charset=utf-8

{
  "@odata.type": "#microsoft.graph.notifications",
  "value": [
    {
      "@odata.type": "#microsoft.graph.notification",
      "changeType": "updated",
      "resource": "/app/calls/2d1a0600-d2bc-40bf-ab9a-ff8119f1282f",
      "resourceData": {
        "@odata.type": "#microsoft.graph.call",
        "state": "established",
        "toneInfo": {
          "@odata.type": "#microsoft.graph.toneInfo",
          "sequenceId": 1,
          "tone": "tone0"
        }
      }
    }
  ]
}
HTTP/1.1 202 Accepted

Transfer

Your bot can also transfer the connection to a user in your Azure AD organization (directory) as follows. For instance, if a human operator is needed in the conversation, your bot can transfer the current connection (call resource) to a user in your organization.
"854315c0-283c-428f-9104-fc9325cc4fad" below is the target user's object id in the Azure AD tenant.

POST https://graph.microsoft.com/beta/app/calls/2d1a0600-d2bc-40bf-ab9a-ff8119f1282f/transfer
Authorization: Bearer eyJ0eXAiOi...
Content-Type: application/json

{
  "transferTarget": {
    "@odata.type": "#microsoft.graph.invitationParticipantInfo",
    "identity": {
      "@odata.type": "#microsoft.graph.identitySet",
      "user": {
        "@odata.type": "#microsoft.graph.identity",
        "id": "854315c0-283c-428f-9104-fc9325cc4fad"
      }
    }
  }
}
HTTP/1.1 202 Accepted

After the connection is transferred and the communication between the users is established, the call resource between your bot and the original caller is terminated by the framework.
The following is the webhook sequence of state changes.

POST https://example.com/yourbot
Accept: application/json
Authorization: Bearer eyJhbGciOi...
Content-Type: application/json

{
  "@odata.type": "#microsoft.graph.notification",
  "changeType": "updated",
  "resource": "/app/calls/2d1a0600-d2bc-40bf-ab9a-ff8119f1282f",
  "resourceData": {
    "@odata.type": "#microsoft.graph.call",
    "state": "transferring"
  }
}
HTTP/1.1 202 Accepted
POST https://example.com/yourbot
Accept: application/json
Authorization: Bearer eyJhbGciOi...
Content-Type: application/json; charset=utf-8

{
  "@odata.type": "#microsoft.graph.notification",
  "changeType": "updated",
  "resource": "/app/calls/2d1a0600-d2bc-40bf-ab9a-ff8119f1282f",
  "resourceData": {
    "@odata.type": "#microsoft.graph.call",
    "state": "transferAccepted"
  }
}
HTTP/1.1 202 Accepted
POST https://example.com/yourbot
Accept: application/json
Authorization: Bearer eyJhbGciOi...
Content-Type: application/json

{
  "@odata.type": "#microsoft.graph.notification",
  "changeType": "deleted",
  "resource": "/app/calls/2d1a0600-d2bc-40bf-ab9a-ff8119f1282f",
  "resourceData": {
    "@odata.type": "#microsoft.graph.call",
    "state": "terminated",
    "terminationReason": "AppTransferred"
  }
}
HTTP/1.1 202 Accepted

 

You can build these requests with a few lines of code using the SDK, and try the official samples on GitHub.

 

Reference :

Microsoft Graph – Working with the calls in Microsoft Graph
https://docs.microsoft.com/en-us/graph/api/resources/call?view=graph-rest-beta

Ignite 2018 – Introduction to programmable voice and video in Microsoft Teams
https://www.youtube.com/watch?v=19uCUjGI-0A&feature=youtu.be
