Wednesday, November 5, 2014

Microscale 3-D Printing

Despite the excitement that 3-D printing has generated, its capabilities remain rather limited. It can be used to make complex shapes, but most commonly only out of plastics. Even manufacturers using an advanced version of the technology known as additive manufacturing typically have expanded the material palette only to a few types of metal alloys. But what if 3-D printers could use a wide assortment of different materials, from living cells to semiconductors, mixing and matching the “inks” with precision?

Jennifer Lewis, a materials scientist at Harvard University, is developing the chemistry and machines to make that possible. She prints intricately shaped objects from “the ground up,” precisely adding materials that are useful for their mechanical properties, electrical conductivity, or optical traits. This means 3-D printing technology could make objects that sense and respond to their environment. “Integrating form and function,” she says, “is the next big thing that needs to happen in 3-D printing.”
                                                                                                   
 Left: For the demonstration, the group formulated four polymer inks, each dyed a different color. 

Right: The different inks are placed in standard print heads. 

Bottom: By sequentially and precisely depositing the inks in a process guided by the group’s software, the printer quickly produces the colorful lattice.



A group at Princeton University has printed a bionic ear, combining biological tissue and electronics (see “Cyborg Parts”), while a team of researchers at the University of Cambridge has printed retinal cells to form complex eye tissue. But even among these impressive efforts to extend the possibilities of 3-D printing, Lewis’s lab stands out for the range of materials and types of objects it can print.

Wednesday, October 15, 2014

Why QHD is the future of smartphones

2014 is shaping up to be the year of QHD displays on our smartphones. The Oppo Find 7 and Vivo Xplay 3S signalled the start, and now the Chinese manufacturers have been joined by Korean mobile giant LG with the LG G3.
Quad HD (QHD) packs four times as many pixels as a 720p HD display; at 2560 x 1440, it's the next step up from full HD (1920 x 1080).
Of course 4K still resides above QHD, but that tech is only just making its way onto high-end TVs and computer monitors so we're unlikely to see it hit our smartphones this year.
While the likes of the Samsung Galaxy S5 and HTC One M8 launched with full HD screens, there are rumors pointing towards QHD reboots of both in the form of the Galaxy S5 Prime and One M8 Prime.
Looking back, mobile displays have dramatically increased in both size and resolution while managing to keep battery life fairly impressive; just take a look at HTC's One M8.
Screens have grown from the 3.7-inch, 480 x 800 (252 ppi) Google Nexus One of 2010 to the full HD displays of the 5.1-inch (432ppi) Galaxy S5 and 5.2-inch (424ppi) Sony Xperia Z2 in 2014.
The difference that four years has made is nothing short of astounding.
It was only a matter of time before manufacturers looked for the next step in screen technology, so the arrival of the 538ppi, 5.5-inch QHD LG G3 and reports of potential M8 Prime and S5 Prime handsets are hardly a surprise.
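As a quick sanity check on these pixel-density figures, ppi is just the diagonal pixel count divided by the diagonal size in inches. A minimal sketch (the pixelsPerInch helper is ours, not from any source quoted here):

```javascript
// Pixel density (ppi) = diagonal resolution in pixels / diagonal size in inches.
function pixelsPerInch(widthPx, heightPx, diagonalInches) {
  const diagonalPx = Math.sqrt(widthPx * widthPx + heightPx * heightPx);
  return diagonalPx / diagonalInches;
}

// QHD on a 5.5-inch panel, as on the LG G3:
console.log(Math.round(pixelsPerInch(2560, 1440, 5.5))); // ~534 ppi
// Full HD on the same 5.5-inch panel, for comparison:
console.log(Math.round(pixelsPerInch(1920, 1080, 5.5))); // ~401 ppi
```

The small gap between this ~534 figure and LG's quoted 538ppi presumably comes from rounding or a slightly different effective diagonal.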

Why oh why?

Why are mobile manufacturers so interested in QHD displays at the moment? According to Michelle Leyden Li, Qualcomm's Senior Director of Marketing, the answer is simply 'us'.
"Consumers never seem to be satisfied and people want more and more things on their devices," Li explains.
"People are using their phones more for movies, TV, video... it's their own personal device and they want a beautiful experience."
It is that experience that is driving consumers to high-end flagship devices, with Apple's Retina display causing excitement when it debuted on the iPhone 4.
There's always an appetite for ever-brighter, higher-resolution displays, as it leaves consumers feeling they're getting next-generation technology for their money.
iPhone 4
As consumers we might be the ones pushing for increased resolutions, but is it really in our best interests? According to Huawei boss Richard Yu, the answer is a very clear no.
"I don't think we need QHD displays on mobiles. Your eyes totally cannot identify between full HD and 2K on a smartphone. You can't distinguish the difference, so it's totally nonsense," Yu told us at the launch of the firm's Ascend P7 flagship.
"We can [put a QHD display on a smartphone], but it's very bad for power consumption and it doesn't offer anything in return."
LG doesn't agree with Mr Yu - shock horror - arguing that a QHD display provides the same resolution we're accustomed to in high-resolution printed magazines and art books.
"We don't want to compete with someone else in the digital world [when it comes to displays]. LG is known for creating beautiful displays for years, and with the QHD screen we're trying to end that battle," explained Dr Ramchan Woo, head of smartphone development at LG.
LG G3
The need for QHD on the G3 came from LG's research in the print industry, where it found that high-res art books have pages equating to around 540ppi.
That pixel density can be matched by QHD displays, and LG claims it also offers improved colour reproduction and sharpness when viewing text.

Friday, August 1, 2014

Samsung Galaxy Alpha With Octa-Core SoC to Launch on August 13

The rumour mill has been spewing out several conflicting launch dates for the anticipated metal-clad Samsung Galaxy Alpha (aka Galaxy S5 Prime and Galaxy F). First a report claimed the August 13 launch, then on Wednesday it was rumoured for August 4, and now yet another report claims the launch of the rumoured Galaxy Alpha will be on August 13 itself.
On Thursday the German website Allaboutsamsung published a report scrapping the August 4 release date from Sammobile, claiming the rumoured metal-clad Samsung Galaxy Alpha smartphone will launch on August 13. The website also published purported specifications of the handset, along with an image of the alleged Samsung Galaxy Alpha's back without the panel (showing the SIM card slot and model number, as well as visible metal seams), and purported Galaxy Alpha AnTuTu benchmark screenshots.



The purported image and benchmark screenshots of the Galaxy Alpha show the device's model number to be SM-G850F. It appears the smartphone will come with support for only a single SIM (Nano-SIM), and will not support storage expansion via microSD card. Notably, the website says it is unsure whether the smartphone seen in the image is the final design that will appear on shelves, and that it is waiting for Samsung to launch the device on August 13.
The purported Galaxy Alpha AnTuTu benchmark screenshots reveal a 720x1280-pixel display resolution with a pixel density of 320ppi for the rumoured 4.8-inch screen. The other leaked specifications of the anticipated Galaxy Alpha include an octa-core Exynos 5433 SoC with a Mali-T628 GPU; 2GB of RAM; 32GB of inbuilt storage; a 12-megapixel rear camera; a 2.1-megapixel front camera; and Android 4.4.4 KitKat.
Earlier this week, the alleged metal-clad Samsung Galaxy Alpha was purportedly spotted in a user agent (UA) profile on Samsung's support site, as well as on Samsung's developer console, once again pointing to a 720p display resolution.





Friday, June 13, 2014

Apple's 'transparent texting' tech lets iPhone users safely message while moving

To enable a "transparent texting" system, Apple proposes that an app's background be modified to display video images continuously captured by an iPhone's rear-facing camera, according to a patent application filed with the U.S. Patent and Trademark Office. 
Due to their inherently mobile nature, smartphones are often used while moving. This is fine for voice calls, but could be problematic for operations that demand visual attention like reading or writing text messages. Aside from appearing antisocial, texting could potentially cause bodily harm if a user operates their device while walking. 



If smartphones were to have a transparent display, or a system that offers the illusion of transparency, users would be more aware of their surroundings. 
The implementation as described by Apple is quite simple. A device uses its rear-facing camera to continuously capture video and present the images as a background within the text messaging app currently being displayed. The onscreen result would offer the illusion of a transparent display with floating text.
In one embodiment, the live video background is displayed behind the usual dynamic user interface seen in iMessage, complete with colored bubbles denoting a chat session between two or more people. These bubbles may be opaque or, in some cases, partially transparent to allow greater visibility of the live background. 



The system can be activated via an in-app button that transforms the GUI from the normal white background to a live video version. Extending the application beyond messaging apps, the live video feed can be used in other situations requiring a significant amount of visual concentration. For example, the implementation may be an option in the use of mobile Web browsers, where text and images would float over the live-view background. Another embodiment covers e-book readers such as Apple's iBooks.
While not a completely foolproof system (users must point the iPhone camera straight ahead while walking for full visibility), Apple's invention shows the company is actively investigating ways to leverage existing hardware to enhance the mobile device experience. It is unknown whether Apple plans to work such a feature into its next iOS build, but the tech required to enable similar functionality is already in place. A transparent texting window could even be considered a good fit with the new "flat," layered iOS 7 design aesthetic.



Sunday, June 1, 2014

Web Animations - element.animate() is now in Chrome 36

Animation on the web was once the province of JavaScript, but as the world moved to mobile, animations moved to CSS for the declarative syntax and the optimizations browsers were able to make with it. With 60fps always the goal on mobile, it makes sense never to step outside what browsers know how to display efficiently.
More tools are appearing to make JavaScript-driven animations more efficient, but the holy grail is a unification of declarative and imperative animations, where the decision of how to write your animations is based on what's the clearest code, not what is possible in one form and not the other.
Web Animations stands to answer that call, and the first part of it has landed in Chrome 36 in the form of element.animate(). This new function lets you create an animation purely in JavaScript and have it run as efficiently as any CSS Animation or Transition (in fact, as of Chrome 34, the exact same Web Animations engine drives all of these methods).
The syntax is simple, and its parts should be familiar to you if you’ve ever written a CSS Transition or Animation:
element.animate([
  {cssProperty: value0},
  {cssProperty: value1},
  {cssProperty: value2},
  //...
], {
    duration: timeInMs,
    iterations: iterationCount,
    delay: delayValue
});
The biggest advantage of this new function is the elimination of a lot of awkward hoops we formerly had to jump through to get a smooth, jank-free animation.
As an example, for Santa Tracker last year, we wanted snow falling continuously, and we decided to animate it via CSS so it could be done efficiently.
However, we wanted to pick the snow's horizontal position dynamically based on screen size and events going on in the scene itself, and of course the height of the snow's fall (the height of the user's browser window) wouldn't be known until we were actually running. This meant we really had to use CSS Transitions, as authoring a CSS Animation at runtime gets complex quickly (and hundreds of snowflakes means hundreds of new styling rules).
So we took the following approach, which should be familiar:
snowFlake.style.transform = 'translate(' + snowLeft + 'px, -100%)';
// wait a frame
snowFlake.offsetWidth;
snowFlake.style.transitionProperty = 'transform';
snowFlake.style.transitionDuration = '1500ms';
snowFlake.style.transform = 'translate(' + snowLeft + 'px, ' + window.innerHeight + 'px)';
The key is in that 'wait a frame’ comment. In order to successfully start a transition, the browser has to acknowledge that the element is in the starting position. There are a few ways to do this. One of the most common ways is to read from one of the element properties that forces the browser to compute layout, thereby ensuring it knows that the element has a starting position before transitioning to the ending position. Using this method allows you to congratulate yourself on your superior knowledge of browser internals while still feeling dirty with every keystroke.
In contrast, the equivalent `element.animate()` call couldn’t be more clear, saying exactly what is intended:
snowFlake.animate([
  {transform: 'translate(' + snowLeft + 'px, -100%)'},
  {transform: 'translate(' + snowLeft + 'px, ' + window.innerHeight + 'px)'}
], 1500);
There are many more options. Just like with its CSS counterparts, Web Animations can be delayed and iterated:
snowFlake.animate([
  {transform: 'translate(' + snowLeft + 'px, -100%)'},
  {transform: 'translate(' + snowLeft + 'px, ' + window.innerHeight + 'px)'}
], {
  duration: 1500,
  iterations: 10,
  delay: 300
});
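Because keyframes are plain JavaScript objects, they can also be generated programmatically per element, which is exactly what makes element.animate() a good fit for hundreds of snowflakes. A minimal sketch of how the per-flake values above might be packaged (makeSnowfall is a hypothetical helper, not part of the API):

```javascript
// Build the keyframe list and timing options for one snowflake. Both are
// plain objects, so they can be computed at runtime for each element.
function makeSnowfall(snowLeft, fallHeight, durationMs) {
  return {
    keyframes: [
      {transform: 'translate(' + snowLeft + 'px, -100%)'},
      {transform: 'translate(' + snowLeft + 'px, ' + fallHeight + 'px)'}
    ],
    timing: {duration: durationMs, iterations: Infinity}
  };
}

// In the browser, each flake would then be started with:
//   const {keyframes, timing} = makeSnowfall(left, window.innerHeight, 1500);
//   snowFlake.animate(keyframes, timing);
```

No forced layout, no transition bookkeeping: each flake just gets its own computed keyframes.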

Thursday, May 15, 2014

YouTube, Facebook Account for Nearly a Third of Mobile Traffic

Facebook and YouTube are now dominating mobile traffic shares in early 2014, as more people shift to mobile devices to upload photos to social networks and watch cat videos. Facebook and YouTube now account for 32% of data sent to and from mobile devices, according to a report by Sandvine.
Individually, Facebook had a 26.9% share of upstream traffic and a 14% share of downstream traffic during peak periods in North America through the beginning of this year, while YouTube had only a 3.7% share of upstream traffic but a 17.6% share of downstream.
With Facebook's high upstream traffic, it seems users are uploading photos and videos from mobile devices more than ever before on the social network. YouTube's downstream traffic share is essentially unchanged from Sandvine's number from last year, 17.7%.
The following chart, created by Statista, lists the top 10 web services ranked by mobile traffic share. 

Wednesday, May 7, 2014

Adobe Launches Standalone Storytelling App for iPad

Many of Adobe's mobile offerings have been extensions of the company's popular software products, such as Photoshop or Lightroom.
Increasingly, however, Adobe is focusing on providing more tools for its mobile users. The company rolled out Adobe Voice Thursday, an all-new storytelling app for iPad users.
The app gives users a way to easily create and share animated videos that combine images, music, voice recordings and special effects.
"Adobe Voice puts the power of Creative Cloud's industry-leading video and audio technology into the hands of the masses," Winston Hendrickson, Adobe's vice-president of products for Creative Media Solutions, said in a release.
The app is meant to be simple to use and to provide people — especially those who may not be able to use traditional video-editing software — with ways to tell stories through video.
Adobe Voice app screen
Users begin by choosing from a preset story template that helps outline the structure of their video. They can then import their own photos, draw from the app's library of 25,000 icons or search the web for their own images.
Adobe also included specially created soundtracks, so users can add music to fit the story. Users can also record their own voiceovers.
The company said it envisions the app being particularly useful in schools, where students and teachers may want to create interactive videos but don't necessarily have the knowledge or resources to use conventional video software. Adobe even had students and teachers test the app during its beta period.
News of Adobe Voice first surfaced last fall, when images of the app's private beta, codenamed "Ginger," leaked online.