Applied Technotopia

We scan the digital environment to examine the leading trends in emerging technology today and learn more about the future.

We have added a few indices around the site. Though we look to the future, we need to keep an eye on the present as well.

Wearable tech and brain-computer interfacing merge in MindRDR, letting you take and send photos from Google Glass with only your thoughts.

futurescope:

MindRDR - Brain Machine Interface connected with Google Glass

As exciting as this is:

MindRDR is a new, free, open-source application that bridges the Neurosky EEG biosensor and Google Glass. It lets users take photos and share them on Twitter and Facebook using brainwaves alone. MindRDR was developed by This Place, a London-based user-experience company.
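Mechanically, the bridge is a simple idea: poll the headset's attention reading and fire the camera once focus has been sustained for a moment. Here is a minimal sketch of that loop; the 0-100 eSense attention scale is real NeuroSky output, but the simulated signal, threshold, and hold time below are my assumptions, not MindRDR's actual code:

```python
import itertools
import time

ATTENTION_THRESHOLD = 80   # NeuroSky's eSense attention is reported on 0-100
HOLD_SECONDS = 1.0         # focus must be sustained this long before firing

# Hypothetical stand-in for the headset: a signal that ramps up and stays high.
# A real bridge would parse eSense values from the ThinkGear serial stream.
attention_stream = itertools.chain(range(0, 101, 5), itertools.repeat(100))

def wait_for_focus():
    """Block until attention stays above the threshold for HOLD_SECONDS."""
    focused_since = None
    for attention in attention_stream:
        if attention >= ATTENTION_THRESHOLD:
            if focused_since is None:
                focused_since = time.monotonic()
            if time.monotonic() - focused_since >= HOLD_SECONDS:
                return
        else:
            focused_since = None
        time.sleep(0.05)

if __name__ == "__main__":
    wait_for_focus()
    print("Focus held -- take the photo and share it")
```

Requiring the threshold to hold for a moment, rather than triggering on a single spike, is the usual way to keep a noisy EEG signal from firing the shutter accidentally.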

I can’t shake the thought that this is the result of a design process equivalent to the one behind The Homer.

Next step for the uberfuturegizmo: add a drone with a depth sensor that records a thought-controlled 3D image of yourself, instantly printed in your own mood color on a wirelessly connected 3D printer.

[MindRDR] [h/t to Jeroen]

(via fuckyeahfutureshock)

An interesting clip of futurist Christopher Barnatt looking at lunar mining.

nosql:

Jason Brownlee put together a list of 7 machine learning books that make use of R:

In this post I want to point out some resources you can use to get started in R for machine learning.

Original title and link: 7 books for Machine Learning with R (NoSQL database © myNoSQL)

Simpler code has fewer bugs.
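A tiny illustration of the point (my own example, not from the linked piece): the hand-rolled loop below offers several places for an off-by-one to hide, while the built-in leaves almost nothing to get wrong.

```python
def total_hand_rolled(values):
    # Manual indexing: several chances for an off-by-one or a bad start value.
    total = 0
    i = 0
    while i < len(values):
        total += values[i]
        i += 1
    return total

def total_simple(values):
    # One built-in call, no index bookkeeping to get wrong.
    return sum(values)

assert total_hand_rolled([1, 2, 3]) == total_simple([1, 2, 3]) == 6
```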

Remembering the Shuttle program.

historical-nonfiction:

Father and son at the first and last shuttle launches. Rest in peace, NASA Shuttle Program.

This is a wonderful step forward for robotic guidance systems.

txchnologist:

University of California, Berkeley engineers are working on cooperative systems to control robots through complex environments. In this demonstration, a ground station uses computer vision to guide the group’s new 13-gram H2Bird ornithopter robot through a window.

The ground station, whose view we see in the second gif, uses real-time motion tracking over a live video stream to send steering guidance to the H2Bird micro air vehicle. With that information, the robot can successfully maneuver through a tight window frame. Their paper on the work is available here.
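At its core, this kind of vision-in-the-loop guidance is a control loop: measure where the robot sits in the camera frame, compute its error from the target, and radio back a correction. The toy proportional sketch below shows the shape of it; the gain, the sign convention, and the tracker/radio interfaces are my assumptions, not the Berkeley team's actual implementation.

```python
TARGET_X = 320          # window center, in pixels of the ground camera's image
KP_YAW = 0.004          # proportional gain: pixel error -> yaw command

def guidance_step(robot_x, send_steering):
    """One control tick: steer the robot toward the window center."""
    error = TARGET_X - robot_x           # horizontal pixel error
    send_steering(KP_YAW * error)        # positive command = yaw right (assumed)

# Simulated run: the "tracker" reports the robot starting left of center.
if __name__ == "__main__":
    robot_x = 250.0
    for tick in range(5):
        guidance_step(robot_x, lambda cmd: print(f"tick {tick}: yaw {cmd:+.3f}"))
        robot_x += 0.4 * (TARGET_X - robot_x)   # toy response of the vehicle
```

Doing the heavy vision work off-board is what makes this practical: a 13-gram ornithopter has no payload budget for a camera and processor, so the ground station sees for it and sends back only lightweight steering commands.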