Last Updated: 26 Jan 2022 | 9 min read | Category: IoT/AR/VR
Are you familiar with VrFace? It’s an open-source library that simplifies the steps developers need to take to build AR Android applications. It builds on OpenCV, an exceptionally popular computer vision library that implements face detection, and on dlib, another popular ML library that provides methods to detect the feature points of an individual’s face.
This topic is exclusively about creating Augmented Reality apps for Android devices by using an open-source library called VrFace.
To get the most out of this write-up, you need a basic understanding of how providers of Android app development services build an Android solution. If you aren’t familiar with the intricacies of such projects, you can always search the web for that background first.
After that, you should be able to guide your team in quickly building an app that lets users apply masks to their faces in real time. Such an app will even track its users’ facial expressions.
This topic will also cover adding new effects to the VrFace library through shaders; if you’re a first-timer, you’ll get a short intro to how effects work. Finally, it will explain a bit about how the library works internally.
Before delving into the details, here’s an explanation of why this library came into existence. About 5-6 years ago, an app came to iOS devices with the ability to incorporate facial effects in real-time.
Developers and communities from all four corners of the world found this particular quality of the app impressive. Unfortunately, there weren’t any apps on Android featuring similar qualities at that time.
Today, such apps are available on Android devices, but there’s still plenty of room for improvement. The process is difficult, especially for people with no professional experience in building for Android.
So, how will you start this project? First of all, you’ll need to hire experienced developers offering Android app development services. They must include the VrFace library as the project’s dependency. The library ships in the AAR format, which is quite similar to an APK’s structure.
It has all the necessary resources, a native library built specifically for the “arm” architecture, and DEX packages. The combination of these three aspects should allow developers to build apps for almost every kind of Android device currently in existence.
In other words, your team of developers should include the dependency, provide configurations, initialize the library, and set up the screen layouts. This touches four specific files. The example repository already contains everything needed; your employees or third-party service providers can fork and clone it to simplify the process.
To start with, your developers should declare a Maven dependency in “build.gradle:”
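The snippet itself is missing here, so the following is only a sketch of what such a build.gradle block might look like, assuming the AAR is published to a GitHub Packages Maven repository. The repository path and coordinates are illustrative assumptions; take the exact ones from the example repository’s README.

```groovy
// Illustrative sketch only: the repository URL and coordinates are placeholders.
repositories {
    maven {
        url = uri("https://maven.pkg.github.com/OWNER/VrFace") // hypothetical path
        credentials {
            username = project.findProperty("gpr.user") ?: ""
            password = project.findProperty("gpr.key") ?: ""
        }
    }
}

dependencies {
    implementation "com.example.vrface:vrface:1.0.0@aar" // hypothetical coordinates
}
```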
Then, they must add credentials in “gradle.properties:”
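Again, the original snippet is absent, so here is a hedged sketch of the corresponding gradle.properties entries. The property names are assumptions and must match whatever names the build.gradle credentials block actually reads.

```properties
# Hypothetical property names; keep this file out of version control.
gpr.user=YOUR_GITHUB_USERNAME
gpr.key=YOUR_GITHUB_TOKEN_WITH_READ_PACKAGES_SCOPE
```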
If you want to use a token from GitHub, ask your developers to go to their account settings and create a token with the “read:packages” permission; GitHub’s documentation has more information about credentials. Once this configuration is in place, Gradle can download the library from the Maven repository when the project builds.
The VrFace library needs an external model to pinpoint and trace sixty-eight facial feature points. As the model is about 60 MB, the library doesn’t bundle it. Your developers must download it separately and decompress it by running this command:
bzip2 -d shape_predictor_68_face_landmarks.dat.bz2
Running the command above produces the “shape_predictor_68_face_landmarks.dat” file. They must rename it to “sp68.dat” and move it to the “app/src/main/assets/” directory.
Now, look at how your AR/VR app development company will initialize the library. They can do it in the “MainActivity” class, which initializes all the layouts, provides the necessary configurations, and loads the library.
To initialize the library, they have to use the OpenCV callback, which is invoked only after OpenCV has loaded its own native library. That makes it the best place to load the native library for VrFace, which, for historical reasons, bears the name “detection_based_tracker.”
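The initialization order can be sketched as follows. This is not VrFace’s actual code: OpenCV’s BaseLoaderCallback is stood in by a minimal interface so the sketch compiles and runs off-device; on Android you would subclass the real callback and hand it to OpenCVLoader’s async initialization instead.

```java
// Sketch: load VrFace's native library only after OpenCV reports success.
class LibraryInit {
    // Stand-in for org.opencv.android.LoaderCallbackInterface (assumption).
    interface LoaderCallback { void onManagerConnected(int status); }
    static final int SUCCESS = 0; // mirrors LoaderCallbackInterface.SUCCESS

    static boolean vrFaceLoaded = false;

    static final LoaderCallback callback = status -> {
        if (status == SUCCESS) {
            try {
                // "detection_based_tracker" is the library's historical name.
                System.loadLibrary("detection_based_tracker");
                vrFaceLoaded = true;
            } catch (UnsatisfiedLinkError e) {
                // The .so only exists on-device; a real app would report this.
            }
        }
    };
}
```

The point of the pattern is ordering: nothing VrFace-related touches native code until OpenCV signals that its own native library is ready.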
If you expect your users to be able to use the camera of their device, you need your developers to add the “FastCameraPreview” to “layout.xml.”
This ensures the library receives camera frames in the format it needs.
Apart from that, your developers should also be able to specify the view element for the result of applying the desired effect to the camera preview.
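A layout sketch tying the two previous points together might look like the following. The package path of FastCameraPreview and the choice of a GLSurfaceView for the output are assumptions; use the element names from the example repository.

```xml
<!-- Illustrative layout sketch; class paths are assumptions. -->
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <!-- Delivers camera frames in the format the library needs -->
    <com.example.vrface.FastCameraPreview
        android:id="@+id/camera_preview"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

    <!-- View element showing the camera preview with the effect applied -->
    <android.opengl.GLSurfaceView
        android:id="@+id/effect_view"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />
</FrameLayout>
```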
To add effects to the application, developers at an AR/VR app development company like Moon Technolabs have to extend the “ShaderEffect” class, exactly the way it’s done in “ShaderEffectMask.”
However, that class implements just one effect, which applies a mask to a three-dimensional face. If you hope to add more effects, you first have to learn more about shaders.
Shaders are small programs executed by the device’s GPU. Their input data comes from textures, such as the image from the camera or other pictures you want to apply to a 3D figure.
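For first-timers, here is what the simplest possible fragment shader could look like, purely as an illustration: it samples the camera texture and tints it. The uniform and varying names are assumptions, not VrFace’s actual ones.

```glsl
// Minimal illustrative fragment shader (GLSL ES): tint the camera image.
precision mediump float;
uniform sampler2D uCameraTexture; // hypothetical uniform name
varying vec2 vTexCoord;           // hypothetical varying name

void main() {
    vec4 color = texture2D(uCameraTexture, vTexCoord);
    // Keep red, slightly dampen green and blue: a warm tint "effect".
    gl_FragColor = vec4(color.r, color.g * 0.8, color.b * 0.8, 1.0);
}
```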
Once your developers take all the steps described here, they can build the app and launch it on an Android phone. During the initial launch, the app takes about thirty seconds to initialize the library and the model; after that, it applies the effects.
Here’s a prebuilt APK binary for you.
The VrFace library itself combines a Java component that works with the camera, shaders used to apply effects, and a native library.
The native library has four vital parts. These include the camera positioning, the OpenCV library, the C++ dlib library, and a few extra methods of finding facial expressions using 3D models.
Writing such a library is a challenge even for the best developers. Apart from being difficult, it’s unnecessarily time-consuming, so it’s better to stick to the basic steps only.
Your team of developers has to use the android.hardware.Camera API. First, they determine the number of cameras available on the device by calling “Camera.getNumberOfCameras().” After that, they obtain a handle to the camera required for the purpose.
Also, your developers should configure the preview parameters.
When they find the list of the preview sizes available, they need to select the most appropriate one based on the screen size.
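The selection step can be sketched in plain Java. This is only one reasonable strategy, not the library’s actual code: android.hardware.Camera.Size is stood in by a small value class so the logic runs off-device. It prefers sizes whose aspect ratio matches the screen’s within a tolerance, then takes the largest of those, falling back to the largest size overall.

```java
import java.util.List;

// Sketch of "pick the most appropriate preview size for the screen".
class PreviewSizePicker {
    // Stand-in for android.hardware.Camera.Size.
    static final class Size {
        final int width, height;
        Size(int w, int h) { width = w; height = h; }
    }

    static Size choose(List<Size> supported, int screenW, int screenH) {
        final double target = (double) screenW / screenH;
        final double tolerance = 0.1;
        Size best = null;
        // Prefer the largest size that roughly matches the screen's aspect ratio.
        for (Size s : supported) {
            double ratio = (double) s.width / s.height;
            if (Math.abs(ratio - target) > tolerance) continue;
            if (best == null || s.width * s.height > best.width * best.height) best = s;
        }
        // Fall back to the largest size if nothing matched the aspect ratio.
        if (best == null) {
            for (Size s : supported) {
                if (best == null || s.width * s.height > best.width * best.height) best = s;
            }
        }
        return best;
    }
}
```

On-device, the candidate list would come from Camera.Parameters’ supported preview sizes, and the chosen size would be written back via the same parameters object.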
The next, more complicated task is starting the preview. Your team has to allocate a buffer and set up a preview callback.
There’s only one method your developers have to implement for the preview.
“void onPreviewFrame(byte[] data, Camera camera).”
In this instance, “data” is the preview frame from the device’s camera.
The preview frame arrives in the NV21 format. That means you have to break it into two planes: one greyscale (luminance) image and one colour (chroma) plane. This matters because the next two steps use the greyscale image.
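The split itself is simple byte arithmetic, sketched below under the standard NV21 layout: the first width*height bytes are the luminance (Y) plane, and the remaining width*height/2 bytes carry the interleaved chroma (VU) samples.

```java
import java.util.Arrays;

// Sketch: split an NV21 preview frame into its grey and colour planes.
class Nv21Splitter {
    /** Returns { greyPlane, chromaPlane } for a width x height NV21 frame. */
    static byte[][] split(byte[] nv21, int width, int height) {
        int ySize = width * height;
        // Y plane: one byte per pixel, usable directly as a greyscale image.
        byte[] grey = Arrays.copyOfRange(nv21, 0, ySize);
        // VU plane: interleaved chroma, subsampled to half the Y plane's size.
        byte[] colour = Arrays.copyOfRange(nv21, ySize, ySize + ySize / 2);
        return new byte[][] { grey, colour };
    }
}
```

The grey plane is what the face-detection and landmark steps consume; the chroma plane is only needed when reconstructing the colour image for display.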
If you hire Android app developers, they can use native C/C++ code. To do that, they have to declare a native interface in a Java class. Here’s an example:
“private static native void nativeDetect(long thiz, long inputImage, long faces);”
This declaration works as the bridge connecting native code with Java. On the C/C++ side, the matching function is defined using the “JNIEXPORT” macro.
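The Java side of that bridge can be sketched in full as below; the class name is a guess based on the native library’s name, and the loadLibrary call is guarded so the sketch also runs on a desktop JVM. On the C/C++ side, the matching symbol would be defined with the JNIEXPORT/JNICALL macros under the JNI-mangled name (e.g. Java_&lt;package&gt;_&lt;class&gt;_nativeDetect).

```java
// Hypothetical Java side of the JNI bridge; names are illustrative.
class DetectionBasedTracker {
    static {
        try {
            // Loads libdetection_based_tracker.so on an Android device.
            System.loadLibrary("detection_based_tracker");
        } catch (UnsatisfiedLinkError e) {
            // Not available off-device; a real app would fail fast here.
        }
    }

    // Declared in Java, implemented in C/C++. The long arguments carry
    // native pointers (e.g. cv::Mat addresses) across the boundary.
    static native void nativeDetect(long thiz, long inputImage, long faces);
}
```

Passing native objects as long handles keeps the Java signature stable while the C++ side casts them back to real pointer types.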
The final step is configuring the build script for the native code with the Android NDK.
There’s a lot more to learn about using the VrFace library to build an AR application for Android devices. Then again, if you hire Android app developers from a recognized agency like Moon Technolabs, you’ll work with people who know VrFace and how to use it to build AR apps for Android devices.
This topic attempted to explain the process of building an app that creates facial effects on Android devices with the VrFace library. To complete such a project, developers have to add the library as a dependency to the project file and carry out all the necessary initializations and configurations.