
Saturday, June 18, 2011

Sync your desktop and phone with a single photo

I don't know about you, but I am forever pulling up Google Maps on screen, sending them to the printer...and then leaving for an assignment without the printouts. To the rescue comes an MIT artificial intelligence expert and a Google engineer who together developed an app that lets you move onscreen data like maps from your computer screen to a phone - and magically open the mapping program in the exact same state on the mobile device.


To use their forthcoming Deep Shot app, you simply point the phone camera at the PC or Mac screen and click the shutter. "The phone automatically opens up the corresponding application in the corresponding state. The same process can also work in reverse, moving data from the phone to a desktop computer," says MIT.
How? MIT's Tsung-Hsiang Chang and Google's Yang Li have written code that runs on both the computer and the phone. It makes visible onscreen the uniform resource identifier (URI), of which the web link, or uniform resource locator (URL), is a mere subset. A URI of this kind is the gobbledegook you get when you press the "link" button on a Google Maps or Street View page. It describes all the map data on the page and, crucially, also scales the data for the screen window.
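To make that concrete, here is a minimal sketch in Python of how a map view might be packed into and later recovered from such a URI. It mimics the kind of link Google Maps produces, but the parameter names and functions are my own illustration, not Deep Shot's actual code:

from urllib.parse import urlencode, urlparse, parse_qs

def encode_map_state(lat, lng, zoom):
    # Pack the current map view into a URI, much as the "link"
    # button on a Google Maps page does (parameter names invented).
    return "https://maps.google.com/?" + urlencode({"ll": f"{lat},{lng}", "z": zoom})

def decode_map_state(uri):
    # Recover the view state on the receiving device, so the map
    # can be reopened in exactly the state it had on screen.
    qs = parse_qs(urlparse(uri).query)
    lat, lng = (float(x) for x in qs["ll"][0].split(","))
    return lat, lng, int(qs["z"][0])

uri = encode_map_state(51.5074, -0.1278, 15)   # central London
print(decode_map_state(uri))                   # (51.5074, -0.1278, 15)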

Snap the screen and the URI is recognised by the phone's app, which calls up the mapping program in the very same running state. It's very cool stuff, although not everyone is so impressed. But best of all, when Google decides to release the app, it'll save me a lot of wasted colour printouts.

Saturday, May 21, 2011

Aurasma app is augmented reality, augmented



Arriving a quarter hour early for a central London briefing this morning I decided to sit in leafy St James's Square, near Piccadilly, to scan the latest tweets. But before I could do so I spotted a man on a street corner staring intently at an iPad 2 he had trained on the gates to the square.
He showed me what he was watching: onscreen, Marilyn Monroe was apparently dancing in a bright yellow summer dress in the morning sunshine on the edge of the square before us. The guy was beaming at the sheer quality of the augmented reality imagery.


As you may have guessed, this iPad 2 user was the technology entrepreneur I had come to meet: Mike Lynch, co-founder and chief executive of Autonomy, the British software house. I wanted to hear about the augmented reality app his firm has just developed - in part because I couldn't fathom the link between AR and the firm's claim to fame to date: predictive software.

Autonomy has gone from nothing in 1996 to a firm worth £7 billion today by leveraging the theories of an 18th-century English mathematician and cleric called the Reverend Thomas Bayes, who worked out how to calculate the probability that certain variables are associated, whether they are words, behaviours or images. Lynch and his colleagues built their business on a pattern recognition engine called the Intelligent Data Operating Layer (Idol) that uses algorithms based on Bayes' ideas.
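Bayes' rule itself fits on one line: the probability of a hypothesis H given evidence E is P(H|E) = P(E|H)P(H)/P(E). A toy illustration in Python, with made-up numbers, shows the flavour of the calculation:

# Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E)
# Toy question: how likely is a document to be about finance,
# given that it contains the word "equity"? (Numbers invented.)
p_topic = 0.10             # prior: 10% of documents are about finance
p_word_given_topic = 0.80  # "equity" appears in 80% of finance documents
p_word = 0.15              # "equity" appears in 15% of all documents

p_topic_given_word = p_word_given_topic * p_topic / p_word
print(f"P(finance | 'equity') = {p_topic_given_word:.2f}")   # 0.53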

On the London tube, Autonomy's Bayesian algorithms analyse CCTV images to calculate the likelihood that someone will try to commit suicide by jumping under a train - allowing the track current to be turned off and help sought. If you follow Formula 1, Ross Brawn's Mercedes F1 team identifies the potential source of every advantage gained by rival teams by training Autonomy algorithms on post-race video. To prevent fraud or noncompliance with financial laws, workplace emails are analysed to infer risk. And police can use Idol to seek hidden patterns in crime reports.

But that's in the PC world. Now, says Lynch, they want to exploit the awesome and growing power of smartphones and tablets running Android and iOS. To do this they have written an app they've called Aurasma that allows anybody to associate real-world items with online content, which they liken to an aura - hence the name.

"We won't be creating the content but we will be providing the infrastructure - including a 10,000-computer server farm - that allows it to be delivered to the real world," says Lynch.
The idea is that media companies can use Aurasma to relate printed matter - street posters, newspapers, magazines - to compelling video and online content they have made themselves or from TV stations and movie studios. Such use will require payments to Autonomy.

But for the rest of us, the service will be free: you can create your own content you'd like to relate to a place, a building or a park, say. And a social network will be built around this, too, allowing users to follow people whose environmental multimedia content they like.

To make it work, Autonomy's coders have rewritten their Bayesian algorithms for iOS and Android. Because Idol is a robust, probabilistic decision-making system, users do not have to train their phone cameras on a flat, brightly lit subject. The printed matter can be bent away from the camera at odd 3D angles and be dimly lit, yet still be recognised. And the probability calculation ensures that the displayed video stays within the bounds of the matter being looked at. So a newspaper photo of David Beckham will pull up video of him playing for LA Galaxy that stays within the frame of the printed photo.
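Autonomy has not published Idol's internals, but the general technique of finding a known image in an obliquely viewed camera frame, and pinning video inside its bounds, can be sketched with off-the-shelf tools. The example below uses OpenCV feature matching and a homography; it is a stand-in for illustration, not Autonomy's Bayesian engine:

import cv2
import numpy as np

def locate_target(target_img, frame):
    # Find a known image (say, a newspaper photo) in a camera frame,
    # even when tilted or dimly lit, and return the four corners of
    # where it appears. Illustrative sketch only.
    orb = cv2.ORB_create(1000)
    k1, d1 = orb.detectAndCompute(target_img, None)
    k2, d2 = orb.detectAndCompute(frame, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(d1, d2), key=lambda m: m.distance)[:50]
    if len(matches) < 10:
        return None  # not confident enough to overlay anything
    src = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    h, w = target_img.shape[:2]
    corners = np.float32([[0, 0], [w, 0], [w, h], [0, h]]).reshape(-1, 1, 2)
    # The video would be warped into this quad each frame, so it
    # stays within the bounds of the printed image.
    return cv2.perspectiveTransform(corners, H)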

Aurasma for print media hits the Apple App Store next week, with a version for TV stations arriving in a month. Augmented reality is a hot field of endeavour and Autonomy will have its work cut out making a dent in it: major publishers like Carlton Books are already shipping books that use the technology, for instance.
But Lynch is unruffled by the task. As I leave his office for the sunshine of St James's Square, it's clear he's particularly proud of having squeezed an 18th-century cleric into our pockets.

Paul Marks

Source New Scientist

Wednesday, May 18, 2011

Taking your data into your own hands

THE iPhone secretly tracks your location. Amazon has lost your files in the cloud. Hackers have stolen the details of 100 million customers from Sony. This string of revelations has left many people wondering who they can trust with their data.

Step forward David Wetherall at the University of Washington in Seattle and colleagues, who are developing tools to monitor the data transmission of apps and provide easy-to-understand "privacy revelations" about each one. "There is much value in simply revealing to users how they are being tracked," says Wetherall, who presented the concept at the HotOS conference in Napa, California, this week.

Whenever you sign up to a website or install an app, you are potentially giving the company behind the service access to your personal data - even if you don't realise it. Tech companies take steps to protect and inform their users about data usage: Apple, for example, vets iPhone apps sold through its store, and Google's Android lists the permissions granted to an app prior to installation. But Wetherall's team believes these measures don't go far enough.

The team gives the example of a sound-meter app funded by adverts, which has access to a phone's microphone to monitor sound levels and to the internet to download ads. Any app with these permissions can also record and upload sound without the user's knowledge, they say.

Tools to halt data leaks already exist, such as WhisperMonitor, an Android app released last week that allows users to monitor and prevent outbound traffic. But Wetherall's team wants to predict data leaks before they happen. To do so, they are developing an app that would run in the background on smartphones or browsers and analyse the flow of information, alerting users before an app tries to access data or pass it to other parties. Crowdsourcing user experiences could also help, allowing people who experience a leak to warn others against using an app.
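Wetherall's team has not released code, but the flavour of a "privacy revelation" can be sketched as simple rules over an app's permission set, as in the sound-meter example above. Everything here - the rule table, names and data - is hypothetical, in the spirit of the proposal rather than its implementation:

# Toy "privacy revelation": infer what an app *could* do from the
# permissions it has been granted. Rules and names are invented.
RISK_RULES = {
    frozenset({"MICROPHONE", "INTERNET"}): "could record and upload audio",
    frozenset({"LOCATION", "INTERNET"}):   "could report your location",
    frozenset({"CONTACTS", "INTERNET"}):   "could upload your address book",
}

def privacy_revelations(permissions):
    granted = set(permissions)
    return [risk for combo, risk in RISK_RULES.items() if combo <= granted]

# An ad-funded sound-meter app, as in the example above:
for warning in privacy_revelations({"MICROPHONE", "INTERNET"}):
    print("Warning: this app", warning)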

Unknown data access is just one problem, however. Trusting companies to look after legitimately collected data is also a concern, as shown by the Sony customers who now find themselves at risk of phishing and other types of fraud. Millions more passwords were also put in danger last week when the online password manager service LastPass admitted it had suffered a potential data breach.

With password leaks now a regular occurrence, a switch to biometric "passwords" might be tempting. But a study due to be presented at the IEEE Symposium on Security and Privacy in Oakland, California, later this month suggests this can actually make a system less secure.

Lorie Liebrock and Hugh Wimberly at New Mexico Tech in Socorro asked 96 volunteers to create two user accounts, one secured by just a password, the other by a password and fingerprint reader. They found the passwords chosen for use with the fingerprint reader were 3000 times easier to break, potentially making the overall security of the system lower than simple password use alone.
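To put "3000 times easier" in perspective, here is a quick back-of-the-envelope conversion into bits of entropy (my arithmetic, not the study's):

import math

# A password 3000 times easier to break has lost log2(3000) bits
# of entropy, roughly the strength of two random characters drawn
# from a 62-symbol alphabet (upper case, lower case and digits).
lost_bits = math.log2(3000)   # ~11.6 bits
per_char = math.log2(62)      # ~5.95 bits per random character
print(f"{lost_bits:.1f} bits lost, about {lost_bits / per_char:.1f} characters")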

As these latest leaks illustrate, believing others will keep your data secure can have disastrous consequences.

Source New Scientist

Thursday, May 5, 2011

Revolutionary new paper computer shows flexible future for smartphones and tablets

KINGSTON, ONTARIO – The world’s first interactive paper computer is set to revolutionize interactive computing.

“This is the future. Everything is going to look and feel like this within five years,” says creator Roel Vertegaal, the director of Queen’s University Human Media Lab. “This computer looks, feels and operates like a small sheet of interactive paper. You interact with it by bending it into a cell phone, flipping the corner to turn pages, or writing on it with a pen.”

The smartphone prototype, called PaperPhone, is best described as a flexible iPhone - it does everything a smartphone does, like store books, play music or make phone calls. But its screen is a 9.5 cm diagonal thin-film flexible E Ink display. This flexible form makes it much more portable than any current mobile computer: it will take the shape of your pocket.

Dr. Vertegaal will unveil his paper computer on May 10 at 2 pm at the Association of Computing Machinery’s CHI 2011 (Computer Human Interaction) conference in Vancouver — the premier international conference of Human-Computer Interaction.
Being able to store and interact with documents on larger versions of these light, flexible computers means offices will no longer require paper or printers.
“The paperless office is here. Everything can be stored digitally and you can place these computers on top of each other just like a stack of paper, or throw them around the desk,” says Dr. Vertegaal.
The invention heralds a new generation of computers that are super lightweight, thin-film and flexible. They use no power when nobody is interacting with them. When users are reading, they don’t feel like they’re holding a sheet of glass or metal.

An article on a study of interactive bending with flexible thin-film computers is to be published at the conference in Vancouver, where the group is also demonstrating a thin-film wristband computer called Snaplet.
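The bend-gesture vocabulary from that study isn't spelled out here, but the basic idea of mapping a bend-sensor reading to a page turn can be sketched with a simple threshold rule. All of the values, names and thresholds below are hypothetical:

BEND_THRESHOLD = 0.3  # arbitrary units from a corner flex sensor

def classify_bend(reading):
    # Positive reading: corner flicked up; negative: flicked down.
    if reading > BEND_THRESHOLD:
        return "page_forward"
    if reading < -BEND_THRESHOLD:
        return "page_back"
    return None  # too slight to count as a deliberate gesture

for sample in (0.05, 0.42, -0.51):
    print(sample, "->", classify_bend(sample))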
The development team included researchers Byron Lahey and Win Burleson of the Motivational Environments Research Group at Arizona State University (ASU), Audrey Girouard and Aneesh Tarun from the Human Media Lab at Queen’s University, Jann Kaminski and Nick Colaneri, director of ASU’s Flexible Display Center, and Seth Bishop and Michael McCreary, vice-president of R&D at E Ink Corporation.

Source Queen's University

Saturday, April 30, 2011

Mike Matas: A next-generation digital book

Software developer Mike Matas demos the first full-length interactive book for the iPad -- with clever, swipeable video and graphics and some very cool data visualizations to play with. The book is "Our Choice," Al Gore's sequel to "An Inconvenient Truth."

Courtesy TED Talks