Applied Technotopia

We scan the digital environment and examine the leading trends in emerging technology to learn more about the future.



We have added a few indices around the site. Though we look to the future, we need to keep an eye on the present as well:

Recent Tweets @leerobinsonp
Posts tagged "video"

This paper microscope is simply brilliant design.

IF EVER a technology were ripe for disruption, it is the microscope. Benchtop microscopes have remained essentially unchanged since the 19th century—their shape a cartoonist’s cliché of science akin to alchemical glassware and Bunsen burners. And that lack of change has costs. Microscopes are expensive (several hundred dollars for a reasonable one) and need to be serviced and maintained. Unfortunately, one important use of them is in poor-world laboratories and clinics, for identifying pathogens, and such places often have small budgets and lack suitably trained technicians. (via Cheap microscopes: Yours to cut out and keep | The Economist)

A bit more on that railgun here.

I look at the US Navy's railgun in action. Sea trials were slated to start in 2016, and have since slipped to 2020.

[The U.S. Navy’s] latest weapon is an electromagnetic railgun launcher. It uses a form of electromagnetic energy known as the Lorentz force to hurl a 23-pound projectile at speeds exceeding Mach 7. Engineers already have tested this futuristic weapon on land, and the Navy plans to begin sea trials aboard a Joint High Speed Vessel Millinocket in 2016.

http://www.wired.com/2014/04/electromagnetic-railgun-launcher/
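For a sense of scale, the quoted figures imply a muzzle energy in the tens of megajoules. A quick back-of-the-envelope check in Python (assuming Mach 1 is roughly 343 m/s at sea level and converting the 23-pound projectile to kilograms; these conversions are my assumptions, not figures from the article):

```python
# Back-of-the-envelope muzzle energy for the quoted railgun figures.
# Assumptions: Mach 1 ~ 343 m/s (sea level), 1 lb = 0.45359237 kg.

MACH_1_M_S = 343.0
LB_TO_KG = 0.45359237

mass_kg = 23 * LB_TO_KG        # ~10.4 kg projectile
speed_m_s = 7 * MACH_1_M_S     # "exceeding Mach 7" -> ~2401 m/s

kinetic_energy_j = 0.5 * mass_kg * speed_m_s ** 2
print(f"Muzzle energy: {kinetic_energy_j / 1e6:.1f} MJ")  # roughly 30 MJ
```

That lands near the ~30 MJ class widely reported for the Navy's prototype launchers, so the Wired numbers are internally consistent.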

[Hiatus ended; sorry for the lull.] An interesting use of an Oculus Rift to control a drone.

prostheticknowledge:

OculusDrone

Hacking experiment by Diego Araos: controlling a camera-mounted Parrot AR drone with an Oculus Rift and head tracking - video embedded below:

Integrated Oculus Rift head tracking and video feed with the AR Drone to make a head motion controller. It’s really fun and the latency is very low.

This project is open source and a fork of another of my projects, drone-swarm (to control several AR Drones within one network)

Source code can be found at GitHub here
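The core idea is simple: read the headset's yaw and pitch angles and turn them into drone velocity commands. A minimal sketch of that mapping in Python (the function name, dead zone, and gain are illustrative assumptions, not the actual code from the project):

```python
def head_to_drone_cmd(yaw_deg, pitch_deg, dead_zone=5.0, gain=0.02):
    """Map head orientation (degrees) to drone commands in [-1.0, 1.0].

    A small dead zone around center keeps the drone steady while the
    pilot looks around slightly; beyond it, the command grows in
    proportion to the angle and is clamped to full deflection.
    """
    def axis(angle):
        if abs(angle) < dead_zone:
            return 0.0
        cmd = min(gain * (abs(angle) - dead_zone), 1.0)
        return cmd if angle > 0 else -cmd

    return {"spin": axis(yaw_deg),          # turn with head yaw
            "front_back": axis(pitch_deg)}  # move with head pitch

# Looking around slightly stays inside the dead zone:
print(head_to_drone_cmd(3.0, 0.0))  # {'spin': 0.0, 'front_back': 0.0}
# A firm head turn commands a proportional spin:
print(head_to_drone_cmd(30.0, -10.0))
```

The dead zone matters in practice: without it, the small involuntary head motions the Rift tracker picks up would translate into constant drift.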

(via we-are-star-stuff)

This clip and the related study offer an interesting look into future human-robot interaction.

futurescope:

Would You Do as a Robot Commands? An Obedience Study for Human-Robot Interaction

Snip from Fast Co:

In the future, we will have robot overlords. This uncomfortable experiment (captured in hilarious video) shows just how easily humans will roll over when we work for the machines.

University of Manitoba:

Would you do as a robot commands? Robots are beginning to play a larger role in society, finding their way into hospitals, the military, and our daily lives; it’s not too far off to think that they may one day be put in positions of authority over people. We know all too well the dark side of authority from classical psychology experiments such as the Milgram and Stanford Prison Experiments, but one question remains: can the authority figure effect apply to robots as well as people? As a preliminary study, we decided to test this theory out; we had our robot pressure participants to continue a highly tedious (and unpleasant) task, and compared the results to having a human experimenter. Did they obey the robot? Check out the paper and project video to find out!

[read more] [University of Manitoba] [paper (pdf)]

An autonomous robot with an insectlike brain.

futurescope:

Conditioned behavior in a robot controlled by a spiking neural network aka robots with insect brains

From KurzweilAI:

German researchers have developed a robot that mimics the simple nervous system used for olfactory learning in the honeybee, using color instead of odors. The researchers have installed a camera on a small robotic vehicle connected to a computer. The computer program replicates, in a simplified way, the sensorimotor neural network of the insect brain and operates the motors of the robot wheels to control its motion and direction based on the colors.

Description from the FU Berlin Team on Youtube:

Here, we present a robotic platform designed for implementing and testing spiking neural network control architectures. We demonstrate a neuromorphic realtime approach to sensory processing, reward-based associative plasticity and behavioral control. This is inspired by the biological mechanisms underlying rapid associative learning and the formation of distributed memories in the insect.

Resources:
iqr: http://iqr.sourceforge.net/?file=kop1…
Extended iqr modules: https://github.com/loairpa/iqrextensions

Additional material for the paper:
Conditioned behavior in a robot controlled by a spiking neural network
Lovísa Irpa Helgadottir, Joachim Haenicke, Tim Landgraf, Raul Rojas and Martin P. Nawrot (submitted 2013)

[read more]
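The learning principle described above, associating a stimulus with a reward and letting that association drive behavior, can be sketched without any neuromorphic hardware. Below is a toy reward-modulated (delta-rule) learner in Python; it illustrates the associative idea only, and is not the spiking-network model from the paper (the color choices, learning rate, and trial count are all assumptions):

```python
import random

def train_color_association(trials=200, lr=0.1, seed=42):
    """Toy associative learning: reward always follows blue, never red.

    Weights from the (R, G, B) input channels to an 'approach' unit are
    updated with a delta rule, lr * (reward - prediction), so the
    approach drive for the rewarded color converges toward 1.
    """
    rng = random.Random(seed)
    w = [0.0, 0.0, 0.0]
    red, blue = (1, 0, 0), (0, 0, 1)
    for _ in range(trials):
        color = rng.choice([red, blue])             # present a stimulus
        drive = sum(wi * xi for wi, xi in zip(w, color))
        reward = 1.0 if color == blue else 0.0      # reward only blue
        for i in range(3):
            w[i] += lr * (reward - drive) * color[i]
    return w

w = train_color_association()
approach = lambda color: sum(wi * xi for wi, xi in zip(w, color))
print(f"approach(blue) = {approach((0, 0, 1)):.2f}")  # near 1.0
print(f"approach(red)  = {approach((1, 0, 0)):.2f}")  # stays near 0.0
```

After training, the robot analogue would steer toward whatever color its weights have come to associate with reward, which is the same conditioned-behavior loop the honeybee-inspired robot runs in real time on camera input.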

Plans to grow stem cells in space may yield interesting results.