I’ve just added a project called AR Maintenance to my Portfolio.
What is AR Maintenance?
A prototype augmented reality application for guiding users through maintenance tasks, built as a team assignment for an undergraduate course in Software Development Practices.
I worked on this project as part of a team of 6 students to meet requirements specified in a proposal assigned to us by our instructors.
The goal was to develop a prototype for an iPad application capable of displaying instructions to guide workers step by step through various maintenance and repair jobs. After finishing the current step, users would be able to advance the job to the next instruction. An AR camera was to recognize real-world objects associated with each instruction and highlight them with computer-generated objects.
The proposal also outlined a feature that would allow users to videoconference with a support representative if additional assistance was required. This feature later proved impractical to include in the time frame allotted to us, so we implemented a simple placeholder instead.
We began development using React Native. We initially planned to develop the project as a mobile app using the iPad camera’s standard functionality in place of the AR component, which we would add later once we figured out how to do so.
Just before the Spring Break holiday, we gave a class presentation to demonstrate our progress. The instructors strongly advised us to change our focus to the AR functionality, as we had yet to get it working.
After much consideration, we decided the best way to move forward was to abandon our work in React Native. Instead, we built the final prototype in about six and a half weeks using the Unity game development platform and the Vuforia SDK, with programming done in C#.
What I Contributed to the Project
When we scrapped our React Native project, we agreed to spend Spring Break researching different AR platforms, then meet to decide which would best serve our needs.
I spent the week going through some of the tutorials in a Udemy course using Unity and Vuforia. Then I applied what I learned to build a basic prototype with these features:
- A model representing a “job” as a series of connected scenes, each with an AR camera, AR objects, and a single text instruction
- An example job with five scenes
- A 2D graphic printed on a sheet of paper to act as an Image Target for the AR cameras
- Scene-dependent activation of different AR objects upon recognition of the Image Target
- A simple Home screen that allowed users to start the example job
- Navigation buttons to move forward to the next scene or backward to the previous one
- A button for returning to the Home screen
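To give a sense of how this worked, the Forward, Back, and Home navigation can be sketched as a small Unity script wired to the buttons. This is an illustrative sketch, not the project's actual code: it assumes each job's scenes occupy consecutive build indices with the Home screen at index 0, which is my own simplification.

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Hypothetical sketch: assumes a job's scenes sit at contiguous
// build indices, with build index 0 reserved for the Home screen.
public class SceneNavigator : MonoBehaviour
{
    // Wired to the Forward button's OnClick event in the Unity editor.
    public void NextScene()
    {
        int current = SceneManager.GetActiveScene().buildIndex;
        if (current + 1 < SceneManager.sceneCountInBuildSettings)
            SceneManager.LoadScene(current + 1);
    }

    // Wired to the Back button's OnClick event.
    public void PreviousScene()
    {
        int current = SceneManager.GetActiveScene().buildIndex;
        if (current > 1) // don't back out of a job into the Home screen
            SceneManager.LoadScene(current - 1);
    }

    // Wired to the Home button.
    public void ReturnHome()
    {
        SceneManager.LoadScene(0);
    }
}
```

In the actual prototype, the button targets were initially configured per scene in the Unity editor, which is the manual setup a teammate later replaced with a code-driven approach.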
At the next meeting, I demonstrated my work, which the team decided to adopt for further development.
In addition to this effort, I contributed a menu system that requests user confirmation before completing a job, returning to the Home screen, or calling for expert assistance. I also worked on some bug fixes and general polish.
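The confirmation flow followed a common Unity pattern: a hidden panel appears before any consequential action, and only the Confirm button proceeds. The sketch below is my own reconstruction, with panel and method names invented for illustration:

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Hypothetical sketch of the confirmation menu: a destructive action
// is deferred until the user explicitly confirms it.
public class ConfirmationMenu : MonoBehaviour
{
    public GameObject confirmationPanel; // assigned in the Unity editor
    private System.Action pendingAction;

    // Called by, e.g., the "Return to Home" button.
    public void RequestReturnHome()
    {
        pendingAction = () => SceneManager.LoadScene(0);
        confirmationPanel.SetActive(true);
    }

    // Wired to the panel's Confirm button.
    public void Confirm()
    {
        confirmationPanel.SetActive(false);
        if (pendingAction != null) pendingAction();
        pendingAction = null;
    }

    // Wired to the panel's Cancel button.
    public void Cancel()
    {
        confirmationPanel.SetActive(false);
        pendingAction = null;
    }
}
```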
What Others Contributed to the Project
My initial prototype included rough features that my teammates later replaced with more polished versions.
For example, user instructions were first implemented as static text objects placed in each scene using the Unity editor. To improve our ability to add and edit text for the UI, two team members incorporated the SQLite database engine into the project, which dynamically loads user instructions and confirmation requests as required.
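A query for this kind of dynamic text loading might look roughly like the following. This is a sketch under assumptions: the database path, table, and column names here are mine, not the schema my teammates actually designed, and it uses the Mono.Data.Sqlite bindings commonly paired with Unity.

```csharp
using Mono.Data.Sqlite; // SQLite bindings commonly used with Unity

// Hypothetical sketch of loading instruction text from SQLite.
// Table and column names are assumptions for illustration only.
public static class InstructionLoader
{
    public static string Load(string dbPath, string jobId, int sceneNumber)
    {
        using (var connection = new SqliteConnection("URI=file:" + dbPath))
        {
            connection.Open();
            using (var command = connection.CreateCommand())
            {
                command.CommandText =
                    "SELECT text FROM instructions " +
                    "WHERE job_id = @job AND scene = @scene";
                command.Parameters.AddWithValue("@job", jobId);
                command.Parameters.AddWithValue("@scene", sceneNumber);
                object result = command.ExecuteScalar();
                return result != null ? result.ToString() : null;
            }
        }
    }
}
```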
Setting up the navigation buttons for a new job was originally a bit cumbersome because it involved editing properties on the Forward and Back buttons in the Unity editor for each scene. A team member streamlined this process and made it less prone to human error by implementing a more robust solution in code.
Another team member was able to use the Vuforia Object Scanner to put together an example job focused on a real-world object, rather than a printed 2D graphic like the one I used. He greatly improved the UI with a new Home screen and Job Category screens, including scrollable areas capable of displaying multiple buttons for launching jobs. He also produced art assets for the UI.
Additionally, team members contributed automated unit tests, various incremental improvements, and bug fixes.
When we were assigned this project, only one or two of us had any experience with mobile development, and none of us knew anything about working with AR. We decided to focus on figuring out how to develop a mobile application using React Native, leaving the AR component for later.
As I mentioned earlier, this decision didn’t work out, and we abandoned React Native in favor of Unity and Vuforia. Although most of us were unfamiliar with these platforms, we found it relatively easy to start making progress with them, and we were able to deliver a satisfactory project for our final grade.
The decision to switch to Unity appealed to us partly because of the Vuforia Object Scanner, which runs on Android smartphones and scans small items for use as Object Targets for an application’s AR cameras. We hoped to scan items relevant to maintenance work such as tools, bolts, spark plugs, and batteries.
In practice, we found the Scanner didn’t work very well for the objects we wanted to use, so ultimately we delivered less than what we wanted. Fortunately, a team member was able to get a reliable scan of a small cardboard box originally used for plastic sandwich bags. He put together an example job around it, so we were able to demonstrate our application’s ability to recognize a real-world object.
About the Portfolio Version
I’ve modified the project in two ways for the purpose of including it in my portfolio.
During development, we were never able to get SQLite working for Android builds, which meant that instruction text and confirmation requests wouldn’t load on Android devices. Since the database functioned properly on an iPad, our target platform, we didn’t address the problem at the time.
I wanted to include a functioning Android version in my portfolio, so I looked into the issue. After trying several potential solutions without success, I worked around it by hard-coding all text used by the UI.
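The workaround amounts to replacing the database lookup with an in-memory table. A minimal sketch, with keys and instruction strings made up for illustration:

```csharp
using System.Collections.Generic;

// Sketch of the Android workaround: UI text keyed by job/scene
// identifiers instead of being loaded from SQLite. The keys and
// strings below are invented for illustration.
public static class HardcodedText
{
    static readonly Dictionary<string, string> Instructions =
        new Dictionary<string, string>
        {
            { "examplejob/1", "Locate the cardboard box." },
            { "examplejob/2", "Open the top flap." },
        };

    public static string GetInstruction(string key)
    {
        string text;
        return Instructions.TryGetValue(key, out text)
            ? text
            : "Instruction not available.";
    }
}
```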
During testing, I also found that buttons contained inside scrollable areas were unreliable on my phone, which made it difficult to activate the AR component. I resolved the issue by removing the scrollable areas for this version of the application.