If you’ve had a Pokémon GO phase or been fooled by that PS5 AR filter on Instagram, you’ve seen augmented reality in action.
With AR technology accessible on our mobile devices, it’s no surprise that we can now preview what makeup looks like on us, or which furniture would suit our homes, with apps like IKEA Place and GIPHY World.
That’s the AR most people know: digital content is overlaid on the physical world through a device’s camera and responds to real-time changes. But did you know this process can be reversed?
We can now bring physical objects into the digital world instead.
A while ago in October, this tweet by software designer Cyril Diagne caught the attention of many. In the video he posted, a user seamlessly “copies” and “pastes” physical objects into a computer screen with the ClipDrop app.
People were blown away by how simple the process was. Gone are the days when we had to manually capture, transfer, and remove backgrounds in Photoshop. It all happens within seconds now.
(Image: ClipDrop on Youtube)
With this AR Copy Paste app, users are able to:
1. Clip surrounding objects with their mobile phones and paste them into another device on the same network.
2. Extract objects, people, drawings and text with a simple point and tap.
3. Instantly extract desktop objects and drop them to any other app.
4. Save cutouts to the cloud for future use.
What goes on behind the screen when you “copy” an object and “paste” it? Diagne and Blanchet’s app uses patented and open source technologies to perform its two most important functions: clipping and dropping.
Boundary-Aware Salient Object Detection (BASNet), a predict-refine architecture, is what detects, isolates and extracts the subjects you point your camera at.
BASNet is composed of an encoder-decoder network in charge of saliency prediction and a residual refinement module that handles saliency map refinement. Trained with a new hybrid loss, it effectively segments salient object regions and predicts fine structures with clear boundaries.
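To get a feel for what that hybrid loss is doing, here is a minimal NumPy sketch of its three ingredients — binary cross-entropy, SSIM, and IoU — applied to a predicted saliency map versus a ground-truth mask. This is an illustrative simplification, not ClipDrop’s or BASNet’s actual code: in particular, the real SSIM term is computed over local patches, while this sketch uses a single global window.

```python
import numpy as np

def bce_loss(pred, target, eps=1e-7):
    # Binary cross-entropy, averaged over all pixels.
    pred = np.clip(pred, eps, 1 - eps)
    return float(-np.mean(target * np.log(pred) + (1 - target) * np.log(1 - pred)))

def iou_loss(pred, target, eps=1e-7):
    # 1 minus intersection-over-union of the soft masks.
    inter = np.sum(pred * target)
    union = np.sum(pred) + np.sum(target) - inter
    return float(1 - inter / (union + eps))

def ssim_loss(pred, target, eps=1e-7):
    # Simplified single-window SSIM (the real loss uses local patches).
    mu_p, mu_t = pred.mean(), target.mean()
    var_p, var_t = pred.var(), target.var()
    cov = ((pred - mu_p) * (target - mu_t)).mean()
    c1, c2 = 0.01 ** 2, 0.03 ** 2
    ssim = ((2 * mu_p * mu_t + c1) * (2 * cov + c2)) / (
        (mu_p ** 2 + mu_t ** 2 + c1) * (var_p + var_t + c2) + eps)
    return float(1 - ssim)

def hybrid_loss(pred, target):
    # BCE penalizes per-pixel errors, SSIM penalizes structural
    # differences, and IoU penalizes region-level mismatch.
    return bce_loss(pred, target) + ssim_loss(pred, target) + iou_loss(pred, target)
```

Combining the three terms is what pushes the network toward masks with sharp, accurate boundaries rather than blurry blobs — exactly what you want when lifting an object cleanly out of a photo.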
Scale-Invariant Feature Transform (SIFT), a formerly patented algorithm available in OpenCV, then enables the app to match the coordinates on your mobile phone with those on your desktop, and finally place extracted objects at specific positions on your computer screen.
For this to happen, the algorithm involves multiple steps: scale-space extrema detection, keypoint localization, orientation assignment, keypoint descriptor computation, and finally keypoint matching.
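The final step — keypoint matching — can be sketched in a few lines of NumPy. Assume 128-dimensional SIFT descriptors have already been computed for both images (e.g. with OpenCV’s `cv2.SIFT_create().detectAndCompute`); the names and dimensions below are illustrative, not ClipDrop’s actual code:

```python
import numpy as np

def match_descriptors(desc_a, desc_b, ratio=0.75):
    """Nearest-neighbour matching with Lowe's ratio test.

    desc_a, desc_b: (N, 128) and (M, 128) arrays of SIFT-style descriptors.
    Returns a list of (index_in_a, index_in_b) pairs that pass the test.
    """
    matches = []
    for i, d in enumerate(desc_a):
        # Euclidean distance from descriptor d to every descriptor in desc_b.
        dists = np.linalg.norm(desc_b - d, axis=1)
        order = np.argsort(dists)
        best, second = order[0], order[1]
        # Keep the match only if the best neighbour is clearly closer
        # than the runner-up (Lowe's ratio test), which filters out
        # ambiguous matches.
        if dists[best] < ratio * dists[second]:
            matches.append((i, int(best)))
    return matches
```

Once enough descriptors match between the phone’s camera frame and the desktop screen, the app knows where on the screen your phone is pointing — and that’s where the cutout gets pasted.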
(Image: ClipDrop on Youtube)
ClipDrop beta is available on Android, iOS, macOS and Windows, currently offering 30 days of free cloud storage and 10 free clips.
Although this is just the beginning of AR innovation, I think we can all agree that this little peek is enough to give us a sense of how far the technology could go.
You could be the next one to open the world’s eyes.
Snappymob believes in going above and beyond. We’ve helped clients from startups to large corporations bring their ideas to life. Feel free to hit us up to discuss yours!
Snappymob is a passionate web and mobile app developer based in Kuala Lumpur, Malaysia. We have designed and developed awesome web and mobile applications for industries and companies around the globe.
We love our craft — the design, development, and the business of apps and this blog is our outlet for discussing what we think and sharing what we know with our community.