# AR view tag options

🔙

The `<AR>` view tag extends `ContentView`, which means you can add regular NativeScript properties like `style`, `row`, `col`, and `horizontalAlignment` as usual.

But to help add behavior to the AR experience, here are the properties and events unique to the `<AR>` tag:

## Properties

| property | default | description |
|---|---|---|
| `trackingMode` | `WORLD` | One of the options in the `ARTrackingMode` enum: `WORLD`, `IMAGE`, `FACE`. If you want to track images (and replace them in-view with, e.g., a box, video, or model), use `IMAGE`. Use `FACE` if you want to augment faces (with the front camera). |
| `trackingImagesBundle` | `undefined` | Only used when `trackingMode` is `IMAGE`. The bundle of images you want to recognize. See the Pokémon demo app for iOS, as well as the screenshot below. For Android, we refer to the assets folder. |
| `debugLevel` | `NONE` | One of the options in the `ARDebugLevel` enum: `NONE`, `WORLD_ORIGIN`, `FEATURE_POINTS`, `PHYSICS_SHAPES`. |
| `planeDetection` | `NONE` | To make the plugin detect planes in `WORLD` tracking mode, set this to either `HORIZONTAL` or `VERTICAL` (the latter is not currently supported on Android). |
| `showStatistics` | `false` | Draws a few statistics at the bottom of the view. |
| `faceMaterial` | - | A string referencing a texture for the face when `trackingMode` is `FACE`. |
| `planeMaterial` | - | A string referencing a texture for the planes. For instance, the demo uses `'tron'`. Can also be a `Color` or `ARMaterial` instance. You won't see the planes if this is not set. |
| `planeOpacity` | `0.1` | Determines how transparent the planes are, where 0 is invisible and 1 is solid. |
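To see how these properties fit together, here's a rough sketch of a plain NativeScript XML page using the tag. The namespace registration and the `onPlaneTapped` handler name are illustrative assumptions; check the demo app for the exact wiring in your flavor of NativeScript:

```xml
<Page xmlns="http://schemas.nativescript.org/tns.xsd"
      xmlns:AR="nativescript-ar">
  <GridLayout rows="*">
    <!-- WORLD tracking with horizontal plane detection and a visible plane texture -->
    <AR:AR row="0"
           trackingMode="WORLD"
           debugLevel="FEATURE_POINTS"
           planeDetection="HORIZONTAL"
           planeMaterial="tron"
           planeOpacity="0.4"
           planeTapped="onPlaneTapped" />
  </GridLayout>
</Page>
```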

### trackingImagesBundle

Note that detection is a bit picky, especially on Android. The most important thing is to make sure the image you're trying to recognize in the real world is flat and at least about 15 × 15 centimeters in size.

However, to help Android speed up loading the images you want to track and improve its accuracy, you can create an `.imgdb` file with the `arcoreimg` tool, like we did here:

```shell
# build the database from a folder of images
~/Toolshed/arcore-android-sdk/tools/arcoreimg/macos/arcoreimg build-db \
  --input_images_directory=demo-pokemon/App_Resources/Android/src/main/assets/PokemonResources/ \
  --output_db_path=demo-pokemon/App_Resources/Android/src/main/assets/PokemonResources

# modify -imglist.txt to fix the names and append '|0.05' to set the width,
# then rebuild the database from that list
~/Toolshed/arcore-android-sdk/tools/arcoreimg/macos/arcoreimg build-db \
  --input_image_list_path=demo-pokemon/App_Resources/Android/src/main/assets/PokemonResources/-imglist.txt \
  --output_db_path=demo-pokemon/App_Resources/Android/src/main/assets/PokemonResources/
```
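For reference, the image list `arcoreimg` consumes is a plain-text file with one `name|path|width` entry per line, where the width is the image's physical width in meters. A hypothetical list (the image names and paths below are made up for illustration) might look like:

```text
# <name>|<path>|<physical width in meters>
bulbasaur|PokemonResources/bulbasaur.png|0.05
charmander|PokemonResources/charmander.png|0.05
squirtle|PokemonResources/squirtle.png|0.05
```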

## Events

| event | event data | description |
|---|---|---|
| `arLoaded` | `ARLoadedEventData` | Triggered when the AR view has been drawn. |
| `planeDetected` | `ARPlaneDetectedEventData` | Triggered when a new plane was detected. |
| `planeTapped` | `ARPlaneTappedEventData` | Triggered when a plane was tapped by the user. Returns the `x`, `y`, and `z` coordinates in 3D space. |
| `sceneTapped` | `ARSceneTappedEventData` | Triggered when the scene was tapped by the user. Returns the `x` and `y` screen coordinates. |
| `trackingImageDetected` | `ARTrackingImageDetectedEventData` | Only used when `trackingMode` is `IMAGE`. Triggered when one of the images in `trackingImagesBundle` was found. You can make the image interactive in various ways. Currently, I've added the ability to play a video, add a model, or add a box at the exact spot the image was found. Please request more features, so I know what to build. |
| `trackingFaceDetected` | `ARTrackingFaceEventData` | Only used when `trackingMode` is `FACE`. Continuously triggered while a face is detected. Dump the returned `properties` property of the `ARTrackingFaceEventData` object to see what's returned. If you need more properties, let us know. There's also a `faceTrackingActions` property which you can use to attach things like models and text to the face being tracked. See the demo for an example. |
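For orientation, a code-behind event handler could look like the sketch below. The event-data shape here is a simplified assumption carrying only the coordinates mentioned in the table; the real interfaces ship with the plugin:

```typescript
// Simplified, assumed shape of the plane-tap event data; the plugin's real
// ARPlaneTappedEventData interface may carry more fields.
interface ARPlaneTappedEventData {
  position: { x: number; y: number; z: number }; // tap location in 3D space
}

// Hypothetical handler, wired up via planeTapped="onPlaneTapped" in the view XML.
export function onPlaneTapped(args: ARPlaneTappedEventData): string {
  const { x, y, z } = args.position;
  // A real app would add a box or model here; this sketch just reports the spot.
  return `Plane tapped at (${x.toFixed(2)}, ${y.toFixed(2)}, ${z.toFixed(2)})`;
}
```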

Continue reading