Lens Studio

Deploy a model to Lens Studio for use in building a Snap Lens.

With a trained model ready in Roboflow, you can deploy your model to SnapML.

Task Support

The following task types are supported by Lens Studio deployment:

Task Type | Supported by Lens Studio
Object Detection | ✓
Classification |
Instance Segmentation |
Semantic Segmentation |

Note: Only models trained using Roboflow Train 3.0 are supported. You can check whether a model was trained with Roboflow Train 3.0 on the Versions page associated with your model.

Deploy a Model to Lens Studio

Click on “Deploy” in the Roboflow sidebar, then scroll down until you see the “Use with Snap Lens Studio” box. Click “Export to Lens Studio”.

When you click this button, a pop up will appear showing information about the classes in your model.

These classes are ordered and will be used in the next step for configuring your model in Lens Studio. Take note of the class list for future use.

In addition, two files will be downloaded:

  1. The Roboflow Lens Studio template, which lets you use your weights in an application with minimal configuration, and

  2. Your model weights.

The Roboflow Lens Studio template is 100 MB, so downloading the template may take a few moments depending on your internet connection.

With the template ready, we can start setting up our model in Lens Studio.

Configure Model in Lens Studio

If you haven’t already installed Lens Studio, go to the Snap AR website and download the latest version of Lens Studio. With Lens Studio installed, we are ready to start configuring our model.

For this section, we will use the Roboflow Lens Studio template, but you can use your model weights in any application that includes the MLController component.

Unzip the Roboflow Lens Studio template you downloaded earlier, then open the “Roboflow-Lens-Template.lsproj” file in the unzipped folder.

When you open the application, you will see something like this:

By default, the template uses a coin counting model. For this example, we will use the playing cards model we built earlier. This application draws boxes around each prediction, but you can add your own filters and logic using Lens Studio.

Click the “ML Controller” box at the top of the left sidebar in Lens Studio:

This will open up a box in which you can configure your model for use in the application next to the preview window:

Our demo application is configured for the coin counter example. To use your own model, first click the “ML Model” box:

Then, drag the weights downloaded from Roboflow into the pop up box:

When you drag in the weights, you will be prompted with some configuration options. In the “Inputs” section of the pop up, set each “Scale” value to 0.0039. Leave the bias values as they are by default.
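This scale value corresponds to roughly 1 / 255. Assuming the exported model expects pixel values normalized to the 0–1 range (typical for models trained in Roboflow, though not stated explicitly here), the scale converts 8-bit pixel intensities (0–255) into that range. A minimal sketch of the arithmetic:

```python
# Sketch of why the "Scale" value is 0.0039 (an assumption about the
# preprocessing, not taken from the Lens Studio documentation).
# 8-bit images store pixel values from 0 to 255; multiplying by 1/255
# rescales them into the 0-1 range most exported models expect.
scale = 1 / 255
print(round(scale, 4))            # 0.0039

pixel_value = 200                 # example 8-bit pixel intensity
normalized = pixel_value * scale  # value the model would receive
print(normalized)                 # ~0.784
```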

Then, click “Import” to import your model.

Configure Classes in Lens Studio

We now have our model loaded into Lens Studio. There is one more step: we need to tell our model which classes we are using.

In the “Class Settings” tab below the ML Model button that we used earlier, you will see a list of classes. These are configured for a coin counter example in our demo project, but if you are working with your own Lens Studio project these values will be blank.

Here, we need to set our class names and labels. The labels must be in the order presented in the Roboflow dashboard. Here is an example of setting one of our values for the playing card application:

We need to do this configuration for each class in our model. You must specify all classes in your model so Snap can interpret the information in the model weights.
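To illustrate why the ordering matters: the model outputs a class index for each detection, and that index is interpreted as a position in the ordered class list. Below is a minimal conceptual sketch using hypothetical playing-card class names; the real list is the one shown in the Roboflow export pop up.

```python
# Hypothetical class list, copied in the exact order shown in the
# Roboflow export pop up. The names below are placeholders.
CLASS_LABELS = ["ace-of-spades", "king-of-hearts", "queen-of-diamonds", "jack-of-clubs"]

def label_for(class_index: int) -> str:
    """Map a class index from the model's output to its human-readable label."""
    return CLASS_LABELS[class_index]

# If the classes were entered in a different order in Lens Studio,
# a detection with index 0 would be displayed with the wrong label.
print(label_for(0))  # ace-of-spades
```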

Now our application is ready to use! You can use the “Preview” box to use your application on your computer, or demo your application on your own device using the Pairing with Snapchat feature.
