
Welcome to the street-AR wiki!

This is some info on how to get the app running on your local machine.

What you will need installed to get this app running locally:

  1. Node.js
  2. NPM
  3. Expo CLI
  4. Expo app installed on your mobile device

Steps to get you up and running:

  1. Clone the repository locally
  2. Install all npm dependencies via the CLI (npm install)
  3. Add a .env file in the project's root (shoot me a DM and I can send this info over encrypted)
  4. In your CLI, navigate to the project directory and run either 'expo start' or 'npm start' (the full command sequence is sketched just after this list)
  5. If you are using an iPhone, you can open up your camera app and hover over the QR code in your console. You will get an alert from Expo, which you can tap, and it will then open up the Expo app and begin loading the JavaScript bundle.
  6. If you are using an Android device, you can open up the Expo app and scan the QR code from inside it. The JavaScript bundle will then begin loading.
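
Put together, steps 1 through 4 look roughly like this in a terminal (the repository URL below is a placeholder, not the actual remote):

```sh
git clone https://github.com/<your-username>/street-AR.git
cd street-AR
npm install
# add your .env file to the project root before starting
expo start   # or: npm start
```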

Once the bundle has finished loading, the app should be running!

** NOTE: If you have your directory cloned into an area of your machine that automatically backs up your data to iCloud or a similar service, your mobile device will be stuck in a bundling loop. To fix this, either move the project directory into a folder that does not get automatically backed up, or turn off this feature while using the app.

Structure and Data flow:

This is a React Native application that uses the Expo SDK toolchain.

We are using Amazon Web Services to host both an EC2 instance that handles API requests and an S3 bucket that holds and serves up our image data. Computer vision is handled by a service called Clarifai: we pass our image data through it to analyze what the user's camera sees, then redirect the user accordingly.
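
As a rough illustration of that round trip from the client's point of view (the endpoint path and response shape here are made up for the example; only the S3-then-Clarifai flow is from our actual setup):

```js
// Hypothetical client-side call: our EC2-hosted API receives the photo,
// stores it in the S3 bucket, runs it through Clarifai, and answers with
// whichever mural the image matched.
const analyzeImage = async (base64Image) => {
  const response = await fetch('https://<our-ec2-host>/api/analyze', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ image: base64Image }),
  });
  return response.json(); // assumed shape: { muralId: ... }
};
```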

We currently have not built out a database, and are maintaining our data entirely within a Redux store.
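
A minimal sketch of what such a store could look like (the field names and action type are assumptions, not the app's actual schema):

```js
import { createStore } from 'redux';

const initialState = {
  murals: [
    // e.g. { id: 1, title: '...', artist: '...', latitude: 0, longitude: 0 }
  ],
};

// a single reducer holds all app data, since there is no database
const rootReducer = (state = initialState, action) => {
  switch (action.type) {
    case 'SET_MURALS':
      return { ...state, murals: action.murals };
    default:
      return state;
  }
};

export default createStore(rootReducer);
```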

For the AR portion of the app, we are using the 'expo-three' and 'expo-graphics' libraries, which are built into the Expo SDK. As these are relatively new libraries, this portion of the app is only accessible to iOS users. Android users should still be able to use the rest of the app as expected. Hopefully this will get resolved in a future SDK update!
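
For a feel of how those two libraries fit together, here is a minimal render-loop sketch (the prop and class names follow the expo-graphics and expo-three examples from this era; treat it as an approximation of the setup, not this app's actual code):

```js
import React from 'react';
import { GraphicsView } from 'expo-graphics';
import ExpoTHREE, { THREE } from 'expo-three';

export default class ARScene extends React.Component {
  // called once the GL context is ready
  onContextCreate = ({ gl, width, height }) => {
    this.renderer = new ExpoTHREE.Renderer({ gl });
    this.renderer.setSize(width, height);
    this.scene = new THREE.Scene();
    this.camera = new THREE.PerspectiveCamera(75, width / height, 0.1, 1000);
    // one THREE point per mural would be added to the scene here
  };

  // called every frame
  onRender = () => this.renderer.render(this.scene, this.camera);

  render() {
    return (
      <GraphicsView
        onContextCreate={this.onContextCreate}
        onRender={this.onRender}
        isArEnabled
      />
    );
  }
}
```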

Here is a diagram for you visual people:

** ENTER DIAGRAM HERE **

Now for a basic walkthrough of the app:

Currently, the app is built around five screens, each of which has a corresponding file/component within the 'screens' directory. These screens are connected through the two files within the 'navigation' directory.
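
Concretely, that wiring probably resembles a react-navigation setup along these lines (the screen file names and the choice of navigator here are illustrative assumptions):

```js
import { createAppContainer, createBottomTabNavigator } from 'react-navigation';

import HomeScreen from '../screens/HomeScreen';
import MapScreen from '../screens/MapScreen';
import ScrollScreen from '../screens/ScrollScreen';
import SceneScreen from '../screens/SceneScreen';
import AIScreen from '../screens/AIScreen';

// one route per screen; react-navigation handles the transitions
const TabNavigator = createBottomTabNavigator({
  Home: HomeScreen,
  Map: MapScreen,
  Scroll: ScrollScreen,
  Scene: SceneScreen,
  AI: AIScreen,
});

export default createAppContainer(TabNavigator);
```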

HOME

Just a simple home screen. Users can click the "Find Artwork" button to see a list of all murals that are currently in our Redux store. You can scroll through these murals, and clicking one will redirect you to the corresponding artwork detail in the 'Scroll' screen.

We also built out a simple search bar, where you can type in the specific mural you are looking for, then tap the "Find Artwork" prompt, and the results will filter down accordingly.
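
The filtering itself can be as simple as a case-insensitive match over the murals in the store; something like this hypothetical helper:

```js
// murals come straight out of the Redux store; 'title' is an assumed field name
const filterMurals = (murals, query) =>
  murals.filter(mural =>
    mural.title.toLowerCase().includes(query.toLowerCase())
  );

// e.g. filterMurals(store.getState().murals, 'dream') -> matching murals only
```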

MAP

This is a map that renders all of the murals within the Redux store. Each pin represents a mural; when clicked, it expands to display the piece of artwork. Clicking that redirects the user to the appropriate detail page on the 'Scroll' screen.
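
A pin-per-mural setup like this is the usual pattern with Expo's MapView (the mural field names and the react-redux wiring below are assumptions):

```js
import React from 'react';
import { connect } from 'react-redux';
import { MapView } from 'expo';

// one pin per mural; tapping an expanded callout jumps to the detail page
const MuralMap = ({ murals, navigation }) => (
  <MapView style={{ flex: 1 }}>
    {murals.map(mural => (
      <MapView.Marker
        key={mural.id}
        coordinate={{ latitude: mural.latitude, longitude: mural.longitude }}
        title={mural.title}
        onCalloutPress={() => navigation.navigate('Scroll', { id: mural.id })}
      />
    ))}
  </MapView>
);

export default connect(state => ({ murals: state.murals }))(MuralMap);
```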

SCROLL

This is a list of all the murals that are stored. Each has a corresponding detail, as well as the artist's name, hometown, and avatar. Users can scroll normally, or use the navigation prompts.
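
A list like this is typically a FlatList over the same Redux data; a minimal sketch, again with assumed field names:

```js
import React from 'react';
import { FlatList, View, Text, Image } from 'react-native';

// assumed mural shape: { id, title, detail, artist: { name, hometown, avatarUrl } }
export default const MuralList = ({ murals }) => (
  <FlatList
    data={murals}
    keyExtractor={item => String(item.id)}
    renderItem={({ item }) => (
      <View>
        <Image source={{ uri: item.artist.avatarUrl }} style={{ width: 48, height: 48 }} />
        <Text>{item.title}</Text>
        <Text>{item.artist.name}, {item.artist.hometown}</Text>
        <Text>{item.detail}</Text>
      </View>
    )}
  />
);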

SCENE

This is the AR screen. It is essentially the Map screen mentioned earlier, but rendered on a 3D augmented-reality plane instead. Each point that renders is associated with a mural location stored within Redux. These points do not expand like on the 2D map, but they do change color from red to blue once the user is within 20 feet of a particular point.
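
The red-to-blue switch comes down to a proximity check against the user's location; roughly like this (a sketch, not the app's actual code):

```js
// Haversine distance in feet between the user and a mural point
const distanceInFeet = (a, b) => {
  const toRad = deg => (deg * Math.PI) / 180;
  const R = 20902231; // Earth's radius in feet
  const dLat = toRad(b.latitude - a.latitude);
  const dLon = toRad(b.longitude - a.longitude);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(a.latitude)) * Math.cos(toRad(b.latitude)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(h));
};

// points flip from red to blue once the user is within 20 feet
const pointColor = (userCoords, muralCoords) =>
  distanceInFeet(userCoords, muralCoords) <= 20 ? 'blue' : 'red';
```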

AI

This screen accesses the camera on the user's phone. When a user takes a picture of a mural, the image is uploaded to our S3 bucket, and the image data is then passed through the Clarifai service. The Clarifai API returns data that allows us to redirect the user to the appropriate mural on the 'Scroll' screen.
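
End to end, the screen's capture-and-redirect flow might look like this (a sketch that leans on the hypothetical analyzeImage helper from the data-flow section above, plus Expo's Camera API):

```js
// cameraRef comes from an Expo <Camera ref={ref => (this.camera = ref)} /> element
const captureAndIdentify = async (cameraRef, navigation) => {
  const photo = await cameraRef.takePictureAsync({ base64: true });
  // analyzeImage is the hypothetical helper sketched earlier: it uploads the
  // photo to S3 via our EC2 API and runs it through Clarifai
  const { muralId } = await analyzeImage(photo.base64);
  navigation.navigate('Scroll', { id: muralId });
};
```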
