
Siri integration, or “Here's what I managed to find in your application”



At WWDC 2016, Apple introduced the world to SiriKit, an API for working with the voice assistant.

If you did not watch the WWDC session about SiriKit and expect to be able to use Siri in any application, you should know that at the moment only a few types of services are supported:

1) Audio and video calls;
2) Messaging;
3) Payments;
4) Photo search;
5) Workouts;
6) Ride booking.

Also, as the documentation says, there is the possibility of interacting with a car via CarPlay (INSetClimateSettingsInCarIntent, INSetSeatTemperatureInCarIntent, etc.).

Thus, Siri can be given the command "<call something, send a message, look for a photo, etc.> through <name of your application>".
Everything is arranged so that there is no need to interact with the neural network directly: the SDK provides simple protocols and a set of lightweight classes for passing information to their methods. The developer only needs to implement these protocols.

For the lazy: at the end of the article there is a link to the demo application (sending a message to your VK friends via Siri).

The cornerstone of SiriKit is the Intent (Android developers, hello!). An object of the INIntent class (or its subclasses) is the input data that Siri generates. INIntent itself contains nothing more than an identifier; the context role is delegated to its subclasses. The specific subclass depends on the type of application. For example, for a workout application, INStartWorkoutIntent contains information about the goal: how much time should be spent on a particular exercise, where the workout takes place, and so on. For a photo service, you can use the INSearchForPhotosIntent intent, which contains a geotag (CLPlacemark), the date the photo was created, the list of people in the photo, etc.

To process an incoming intent, the developer needs to create objects of the INIntentResolutionResult class (or rather, its subclasses) at the resolution stage, and INIntentResponse subclasses at the confirmation and execution stages.

There are three stages of processing the input information:

1) Resolve — clarification of the intent parameters;
2) Confirm — confirmation that the intent can be executed;
3) Handle — execution of the intent.

At each of these stages, we receive an intent and must return the corresponding result (an INIntentResolutionResult or an INIntentResponse).

For example, if we are developing a messaging application, then at the resolution stage we uniquely identify the users to whom the message is sent (resolveRecipientsForSendMessage) and the message content (resolveContentForSendMessage). At the confirmation stage (confirmSendMessage), we check that everything is ready for sending (for example, that the user is authorized). And finally, at the execution stage (handleSendMessage), we send the message to the selected recipients.

At any stage, the scenario can develop in one of two ways: positive or negative. The choice of path is delegated to the developer: Siri provides the processed data as an intent object, and the programmer decides what result to return to the system.

Intents Extension


In order for your application to work with Siri, you will need to add the NSSiriUsageDescription key to Info.plist with text explaining to the end user why your application needs access to Siri (similar to NSLocationUsageDescription for geolocation).



And request permission:

// status will be INSiriAuthorizationStatusAuthorized if the user grants access
[INPreferences requestSiriAuthorization:^(INSiriAuthorizationStatus status) { }];

After that, you need to add a target of type Intents Extension to the project.



After adding the target, pay attention to its structure. The entry point for intents is an object of an INExtension subclass. In its handlerForIntent method, the object that will handle the incoming intent is selected. The Info.plist of the target specifies the name of the INExtension subclass and the supported intent types. You can also specify which intent types will be unavailable on the locked screen.
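
A minimal sketch of such an entry point (the class names IntentHandler and MessageIntentHandler are assumptions for illustration):

#import <Intents/Intents.h>

// Entry point of the Intents Extension; the class name must match
// the NSExtensionPrincipalClass value in the target's Info.plist.
@interface IntentHandler : INExtension
@end

@implementation IntentHandler

// Selects the object that will handle the incoming intent.
- (id)handlerForIntent:(INIntent *)intent {
    if ([intent isKindOfClass:[INSendMessageIntent class]]) {
        return [[MessageIntentHandler alloc] init]; // hypothetical handler class, declared below
    }
    return nil; // intent types we do not support
}

@end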



To handle a specific type of intent, you need to implement a specialized protocol in your class (for example, INSendMessageIntentHandling). The protocol contains the methods necessary for passing the above steps (resolve, confirm, handle).
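
The hypothetical MessageIntentHandler mentioned above could be declared like this:

// Handler for the message-sending intent; adopts the specialized protocol.
@interface MessageIntentHandler : NSObject <INSendMessageIntentHandling>
@end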

Consider the implementation of the handler class for the message sending intent.

As described above, at the first stage (resolution) it is required to unambiguously identify the users and the content of the message. As for the content, in most cases it is enough to know that it is not empty. But with the recipients it is not so simple. From the intent we can get the list of users that Siri recognized — an array of INPerson objects. For each recipient, you must find a match in the list of existing users.

There are three possible outcomes:

1) Exactly one match is found — the recipient is resolved successfully;
2) Several matches are found — disambiguation is required: Siri asks the user to choose one of them;
3) No matches are found — the recipient cannot be resolved.

A sketch of this resolution logic is shown after the note below.
Note - after the user makes a choice in the disambiguation case, the resolveRecipientsForSendMessage method will be executed again, so when matching users you need to keep in mind that the method can be called several times.
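
Here is the sketch, assuming a hypothetical findUsersMatching: lookup method on our handler class:

- (void)resolveRecipientsForSendMessage:(INSendMessageIntent *)intent
                         withCompletion:(void (^)(NSArray<INPersonResolutionResult *> *))completion {
    NSMutableArray<INPersonResolutionResult *> *results = [NSMutableArray array];
    for (INPerson *recipient in intent.recipients) {
        // Hypothetical search for matching users by the name Siri recognized.
        NSArray<INPerson *> *matches = [self findUsersMatching:recipient.displayName];
        if (matches.count == 1) {
            // Exactly one match: the recipient is resolved successfully.
            [results addObject:[INPersonResolutionResult successWithResolvedPerson:matches.firstObject]];
        } else if (matches.count > 1) {
            // Several matches: Siri will ask the user to choose one.
            [results addObject:[INPersonResolutionResult disambiguationWithPeopleToDisambiguate:matches]];
        } else {
            // No matches: this recipient cannot be handled.
            [results addObject:[INPersonResolutionResult unsupported]];
        }
    }
    completion(results);
}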

At the confirmation stage, we check the conditions necessary for sending the message. Finally, in the handleSendMessage method (the final stage), the message is sent. Each stage has a positive and a negative scenario.
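
A sketch of these two stages, where userIsAuthorized and sendMessage:toRecipients: are hypothetical helpers of our handler class:

- (void)confirmSendMessage:(INSendMessageIntent *)intent
                completion:(void (^)(INSendMessageIntentResponse *))completion {
    // Negative scenario: the user must launch the app first (for example, to log in).
    INSendMessageIntentResponseCode code = [self userIsAuthorized]
        ? INSendMessageIntentResponseCodeReady
        : INSendMessageIntentResponseCodeFailureRequiringAppLaunch;
    completion([[INSendMessageIntentResponse alloc] initWithCode:code userActivity:nil]);
}

- (void)handleSendMessage:(INSendMessageIntent *)intent
               completion:(void (^)(INSendMessageIntentResponse *))completion {
    // Final stage: actually send the message to the resolved recipients.
    BOOL sent = [self sendMessage:intent.content toRecipients:intent.recipients];
    completion([[INSendMessageIntentResponse alloc]
        initWithCode:(sent ? INSendMessageIntentResponseCodeSuccess
                           : INSendMessageIntentResponseCodeFailure)
        userActivity:nil]);
}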

Siri does not always understand what we want to say. To help it, you can use a Vocabulary — a glossary of terms. There are two types: static and dynamic. In the first case, we provide a special AppIntentVocabulary.plist containing common terms. The dynamic dictionary is filled with terms specific to the current user:

// usersNames is assumed to be an NSMutableOrderedSet of the current user's contact names
[[INVocabulary sharedVocabulary] setVocabularyStrings:[usersNames copy] ofType:INVocabularyStringTypeContactName];

It is the main application, not the Intents Extension, that is responsible for the vocabulary. That is, your additional terms must be registered before the Siri extension is launched.

Intents UI Extension


In addition to the data processing logic, SiriKit provides the ability to change the interface that displays the data. For this there is the Intents UI Extension. By analogy with the Intents Extension, the Info.plist of the UI target contains a list of supported intents, as well as the name of the storyboard file.



Note - for any intent, the same controller (the entry point in the storyboard) will be created in the UI extension. To delegate display logic, it is recommended to use child view controllers. The base controller must adopt the INUIHostedViewControlling protocol, which defines the interface configuration method, configureWithInteraction. Consider the parameters of this method:

1) interaction (INInteraction) — contains the incoming intent and its response;
2) context (INUIHostedViewContext) — indicates where the view will be hosted (a Siri snippet or a Maps card);
3) completion — a block to which you pass the desired size of your view.
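A minimal configuration sketch (messageLabel is an assumed outlet of the controller):

- (void)configureWithInteraction:(INInteraction *)interaction
                         context:(INUIHostedViewContext)context
                      completion:(void (^)(CGSize))completion {
    if ([interaction.intent isKindOfClass:[INSendMessageIntent class]]) {
        INSendMessageIntent *intent = (INSendMessageIntent *)interaction.intent;
        self.messageLabel.text = intent.content; // show the message in our own UI
    }
    // Report the desired size; Siri constrains it to the allowed maximum.
    completion(self.extensionContext.hostedViewMaximumAllowedSize);
}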
In addition to the main INUIHostedViewControlling protocol, there is an additional protocol, INUIHostedViewSiriProviding, which allows you to control the display of the standard Siri interfaces for messages (displaysMessage) and maps (displaysMap). Apple's requirements for Siri interfaces prohibit displaying advertising in snippets. If you plan to use animation, it is recommended to run it in viewDidAppear.
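
For example, returning YES from displaysMessage tells Siri that our snippet draws the message content itself, so the standard message bubble is not shown:

// Our snippet displays the message itself; suppress Siri's standard message UI.
- (BOOL)displaysMessage {
    return YES;
}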

To transfer data between the main application and the extensions, you still need to use App Groups (see the example in the demo).
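
A common approach is shared NSUserDefaults (the group identifier here is an assumption; use your own):

// Shared storage visible to both the main application and its extensions.
NSUserDefaults *shared = [[NSUserDefaults alloc] initWithSuiteName:@"group.com.example.siridemo"];
[shared setObject:@"user-token" forKey:@"accessToken"]; // e.g., a token the extension needs
NSString *token = [shared stringForKey:@"accessToken"];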

Links


SiriKit Programming Guide
Demo application

Source: https://habr.com/ru/post/303886/

