Flutter and AR

Getting started with Augmented Reality in Flutter using ARCore

What Is Augmented Reality?

According to Wikipedia,

Augmented reality is an interactive experience of a real-world environment where the objects that reside in the real world are enhanced by computer-generated perceptual information, sometimes across multiple sensory modalities, including visual, auditory, haptic, somatosensory and olfactory.

The basic idea of augmented reality is to superimpose graphics, audio and other sensory enhancements over a real-world environment in real time.

But haven’t television networks been doing that with graphics for decades?

Yes, but AR is more advanced than anything you have seen in television broadcasting. Those systems display graphics from only one point of view; next-generation augmented-reality systems will display graphics from each viewer’s perspective.

There are many AR platforms that can help you build amazing AR applications easily, such as ARCore (Google), ARKit (Apple), Vuforia, EasyAR, etc.

For this article, I’ll be using ARCore with Flutter. Thankfully, Gian Marco Di Francesco has created a Flutter plugin for this, which eases our task considerably. You can check out his Twitter profile for more demos.


Gian has written a pretty explanatory blog post covering the setup and configuration. Along with those settings, I had to install Google Play Services for AR on my Android device to make my application behave properly.

Code Snippet

The code below is similar to what Gian has explained in his examples. I have tinkered with his code a bit so that my API server is notified whenever I tap on an AR object. This lets me integrate my existing Python API services and execute them whenever I tap on the AR object.

Let’s spend some time understanding the code now.

1. Import all required packages.

import 'package:flutter/material.dart';
import 'package:vector_math/vector_math_64.dart' as vector;
import 'package:arcore_flutter_plugin/arcore_flutter_plugin.dart';
import 'package:http/http.dart' as http;

2. Create controller.

ArCoreController arCoreController;

3. Make the body for the AR view.

body: ArCoreView(
  enableTapRecognizer: true,
  onArCoreViewCreated: _onArCoreViewCreated,
),

Here I have set enableTapRecognizer to true, and whenever the view is created, the _onArCoreViewCreated function will be executed to do some further setup.

4. _onArCoreViewCreated function

void _onArCoreViewCreated(ArCoreController _arcoreController) {
  this.arCoreController = _arcoreController;
  this.arCoreController.onNodeTap = (name) => onNodeTapHandler(name);
  _addCylinder(arCoreController);
  _addSphere(arCoreController);
}

This function registers another function that will be executed whenever a node is tapped.

It also calls two functions that create a cylinder object and a sphere object and then add those to the AR controller we created in step 2.

5. What happens when a node is tapped.

void onNodeTapHandler(String name) {
  print(name);
  var url = "$name";
  var res = http.get(Uri.encodeFull(url));
}

This function is the one that gets executed whenever an AR node is tapped.

It takes the name of the node and passes it to the API server. (I have hardcoded the server’s IP for now; do change it for your case!)

6. How does this function/handler come to know what name to pass to the API server?

Each node has an ID or name that the AR controller uses to manage it internally. But we can also give each node a custom name, which lets us trigger specific API endpoints on our server.
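On the server side, this just means one route per node name. Here is a minimal sketch of what such a Python API server could look like, using only the standard library; the node names 'cylinder' and 'sphere' and the response strings are my own illustrative choices, not the actual services I run.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical mapping from tapped-node name to an action's result.
# In a real setup each entry would trigger actual work (Docker, IoT, etc.).
NODE_ACTIONS = {
    "/cylinder": "cylinder handler fired",
    "/sphere": "sphere handler fired",
}

class TapHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The Flutter app sends GET http://<server>/<node-name>
        known = self.path in NODE_ACTIONS
        message = NODE_ACTIONS.get(self.path, "unknown node")
        self.send_response(200 if known else 404)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(message.encode())

def serve(port=8000):
    # Blocks forever, handling one tap notification per request
    HTTPServer(("", port), TapHandler).serve_forever()
```

Any web framework would do equally well here; the only contract is that the route path matches the custom node name sent by the app.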

Line 55 gives a custom name to the node.

The above function is responsible for creating a cylinder node (line 54) and adding that node to _arcoreController (line 63).

To create the node, a few things are required:

a) the position of the node relative to the camera.

b) the shape of the node to be created.

You can specify more properties for the shape, such as material, dimensions, etc.
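Putting the position and shape together, a function like _addCylinder might look as follows. This is a minimal sketch based on the arcore_flutter_plugin API; the colour, dimensions, position, and the custom name 'cylinder' are illustrative choices of mine, not necessarily the values used in the original gist.

```dart
import 'package:arcore_flutter_plugin/arcore_flutter_plugin.dart';
import 'package:flutter/material.dart';
import 'package:vector_math/vector_math_64.dart' as vector;

void _addCylinder(ArCoreController controller) {
  final material = ArCoreMaterial(color: Colors.red);
  // b) the shape: a cylinder with a material and dimensions in metres
  final cylinder = ArCoreCylinder(
    materials: [material],
    radius: 0.5,
    height: 0.3,
  );
  final node = ArCoreNode(
    shape: cylinder,
    // a) the position relative to the camera:
    // half a metre down, two metres in front
    position: vector.Vector3(0.0, -0.5, -2.0),
    // custom name, reported back by onNodeTap and sent to the API server
    name: 'cylinder',
  );
  controller.addArCoreNode(node);
}
```

The sphere function follows the same pattern with ArCoreSphere in place of ArCoreCylinder.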

7. And yes, like most widgets in Flutter, these nodes can have children.

check line 79

Here I have created two nodes in the same function, moon and node, and node has moon as one of its children (children is a simple list).

As you can see, there are three nodes now, all attached to a single controller. Note that no name is given to the moon node: try tapping on it and check its name in the console (line 45), then compare it with the names of the other two nodes.
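A parent/child setup like the one just described can be sketched like this; again, the shapes, sizes, positions and the parent’s name 'sphere' are illustrative assumptions about the original code, not a copy of it.

```dart
import 'package:arcore_flutter_plugin/arcore_flutter_plugin.dart';
import 'package:flutter/material.dart';
import 'package:vector_math/vector_math_64.dart' as vector;

void _addSphereWithMoon(ArCoreController controller) {
  // Child node: deliberately given no name, so tapping it
  // prints whatever id the controller assigns internally
  final moon = ArCoreNode(
    shape: ArCoreSphere(
      materials: [ArCoreMaterial(color: Colors.grey)],
      radius: 0.03,
    ),
    // a child's position is relative to its parent, not the camera
    position: vector.Vector3(0.2, 0.0, 0.0),
  );
  final node = ArCoreNode(
    shape: ArCoreSphere(
      materials: [ArCoreMaterial(color: Colors.blue)],
      radius: 0.1,
    ),
    position: vector.Vector3(0.0, 0.0, -1.5),
    name: 'sphere',
    children: [moon], // children is a plain list of ArCoreNode
  );
  // adding the parent adds the whole subtree to the scene
  controller.addArCoreNode(node);
}
```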

There you have it! We’ve successfully created an AR application with Flutter that can call your API endpoints. Those endpoints can do anything: manage Docker or other complex infrastructure, hit IoT device endpoints, or even drive other data analytics and machine learning projects.

I hope you enjoyed building it. If you followed along, let me know how it went, and what use cases you come up with.

Happy coding!