Thursday 10 November 2016

Vine may survive

Twitter plans to sell Vine

Vine may survive after all. Twitter is currently vetting multiple term sheets from companies offering to buy Vine, and hopes to make a deal soon, multiple sources tell TechCrunch. After announcing its plan to shut down Vine last month, Twitter received a large number of bids, including several from Asia. It’s now working to decide who should run the short-form video app.
While TechCrunch couldn’t confirm the names of any of the companies interested in Vine, a rumored bidder was Japanese messaging and gaming company LINE. [Update: We’ve learned that Twitter has narrowed the pool from more than 10 bidders to around 5.]
Twitter announced on October 27th that it planned to eventually shut down Vine, but keep the archive of Vines playable and allow users to download their content. But beyond clarifying some of the original announcement in an FAQ, it hasn’t made any further announcements about the future of Vine, which makes sense because it’s in active talks to sell the app.
One source says that at least some of the offers are for less than $10 million, indicating Twitter might not generate significant revenue directly from selling Vine.

However, Vine could still benefit Twitter if it ends up owned by someone who helps it thrive and retains the strong integration between the two apps. Vine content plays instantly in the Twitter stream, bolstering its current parent company’s quest to serve more video that could attract user engagement.
If Vine doesn’t shut down and creators keep producing the six-second clips, Twitter could earn money from sponsored content deals arranged by Niche, the social media talent broker startup that Twitter acquired in 2015.
Recode reported in September that Twitter might look to do something different with Vine. The New York Times wrote in early October that while Twitter was trying to sell itself, it was also looking to get rid of Vine to cut costs. It later reported that Vine was costing $10 million a month to run in infrastructure and employees, and Twitter had explored selling it off.
After the shutdown announcement, a sale seemed to be off the table. But TechCrunch has learned that the outpouring of support and mourning for Vine from the internet community boosted the interest of potential acquirers willing to shoulder the operating costs to own one of the most culturally impactful video platforms. While Vine had been rapidly losing users and star creators, plenty of people wanted to see it live on and were angry that Twitter planned to kill off the app.

Thursday 3 November 2016

Android Nougat Arrives


Google has begun rolling out the latest version of Android, the operating system running most smartphones, not to mention a lot of TVs and other devices. Of course, many existing devices won’t get the new version. But users of recent-model Nexus phones will be the first to receive Android version 7.0, otherwise known as Nougat, with millions of others likely to follow.
Nexus phones run plain Android, without the skins and ‘improvements’ laid over the top by most major phone producers, so Google can deliver a new version to them without complications. Dave Burke, Google’s Vice President of Engineering, says that while the roll out to Nexus devices is happening, the company is “pushing the Android 7.0 source code to the Android Open Source Project (AOSP), extending public availability of this new version of Android to the broader ecosystem.”
Translation: it is being made available to the other phone makers so that they can work out how to include it in their phones. The major phone makers are likely to provide updates for their current premium models – these are typically running Marshmallow, Android 6.0.1 – within a few months, once they have integrated their proprietary interfaces with it and are sure it will work properly.
Check out the new look of Android Nougat: https://plus.google.com/+RohanBlake/posts/KgMqN35m11j

Wednesday 2 November 2016

How Nanoscience Will Improve Our Lives in the Coming Years


In a newly published study, nine prominent nanoscientists look ahead to what we can expect in the coming decade, and conclude that nanoscience is poised to make important contributions in many areas, including health care, electronics, energy, food and water.

Nanoscience research involves molecules that are only 1/100th the size of cancer cells and that have the potential to profoundly improve the quality of our health and our lives.

Friday 21 October 2016

Heme Molecule May Be the Key to More Efficient Batteries


New research from Yale University shows that a molecule that transports oxygen in blood could be key to developing the next generation of batteries.
Lithium-oxygen (Li-O2) batteries have emerged in recent years as a possible successor to lithium-ion batteries — the industry standard for consumer electronics — due to their potential for holding a charge for a very long time. Electronic devices would go for weeks without charging, for instance; electric cars could travel four to five times longer than the current standard.
But before this can happen, researchers need to make Li-O2 batteries efficient enough for commercial application and prevent the formation of lithium peroxide, a solid precipitate that covers the surface of the batteries’ oxygen electrodes. One obstacle is finding a catalyst that efficiently facilitates a process known as the oxygen evolution reaction, in which lithium oxide products decompose back into lithium ions and oxygen gas.
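For reference (this equation comes from the general Li-O2 literature rather than from the Yale paper itself), the overall cell reaction is commonly written as

\[ 2\,\mathrm{Li^{+}} + \mathrm{O_{2}} + 2e^{-} \;\rightleftharpoons\; \mathrm{Li_{2}O_{2}} \]

Discharge runs left to right, depositing the solid lithium peroxide; the oxygen evolution reaction on charge runs right to left, recovering lithium ions and oxygen gas.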
The Yale lab of Andre Taylor, associate professor of chemical and environmental engineering, has identified a molecule known as heme that could function as a better catalyst. The researchers demonstrated that the heme molecule improved Li-O2 cell function by lowering the amount of energy required to charge the battery and improving its charge/discharge cycle times.
The lead author is Won-Hee Ryu, a former postdoctoral researcher in Taylor’s lab, who is now an assistant professor of chemical and biological engineering at Sookmyung Women’s University in South Korea.
Heme is the molecule that makes up one of the two parts of hemoglobin, which carries oxygen in the blood of animals. Used in a Li-O2 battery, Ryu explained, the molecule would dissolve into the battery’s electrolytes and act as what’s known as a redox mediator, which lowers the energy barrier required for the electrochemical reaction to take place.
“When you breathe in air, the heme molecule absorbs oxygen from the air to your lungs and when you exhale, it transports carbon dioxide back out,” Taylor said. “So it has a good binding with oxygen, and we saw this as a way to enhance these promising lithium-air batteries.”
The researchers added that their discovery could also help reduce the amount of animal waste requiring disposal.
“We’re using a biomolecule that traditionally is just wasted,” said Taylor. “In the animal products industry, they have to figure out some way to dispose of the blood. Here, we can take the heme molecules from these waste products and use it for renewable energy storage.”
Ryu noted that by using recyclable biowaste as a catalyst material, the technology is both effective and well suited to green energy applications.

Wednesday 12 October 2016

New Beaver-Inspired Wetsuits May Help Keep Surfers Warm


Inspired by semiaquatic mammals such as beavers and sea otters, MIT engineers are fabricating fur-like, rubbery, hair-lined wetsuits that may help keep surfers warm.
Beavers and sea otters lack the thick layer of blubber that insulates walruses and whales. And yet these small, semiaquatic mammals can keep warm and even dry while diving, by trapping warm pockets of air in dense layers of fur.
Inspired by these fuzzy swimmers, MIT engineers have now fabricated fur-like, rubbery pelts and used them to identify a mechanism by which air is trapped between individual hairs when the pelts are plunged into liquid.
The results, published in the journal Physical Review Fluids, provide a detailed mechanical understanding of how mammals such as beavers insulate themselves while diving underwater. The findings may also serve as a guide for designing bioinspired materials such as fur-like, rubbery wetsuits.
When the group returned from the trip, Anette Hosoi, a professor of mechanical engineering at MIT, assigned the problem to graduate student Alice Nasto, encouraging her to find examples in nature that could serve as a design model for warm, dry, streamlined wetsuits. In her literature searches, Nasto zeroed in on semiaquatic mammals, including beavers and sea otters. Biologists had observed that these animals trap, or “entrain” air in their fur.
Nasto also learned that the animals are covered in two types of fur: long, thin “guard” hairs that act as a shield for shorter, denser “underfur.” Biologists have thought that the guard hairs keep water from penetrating the underfur, thereby trapping warm air against the animals’ skin. But as Nasto notes, “there was no thorough, mechanical understanding of that process. That’s where we come in.”
Deep pockets
The team laid out a plan: Fabricate precise, fur-like surfaces of various dimensions, plunge the surfaces in liquid at varying speeds, and with video imaging measure the air that is trapped in the fur during each dive.
To make hairy surfaces, Nasto first created several molds by laser-cutting thousands of tiny holes in small acrylic blocks. With each mold, she used a software program to alter the size and spacing of individual hairs. She then filled the molds with a soft casting rubber called PDMS (polydimethylsiloxane), and pulled the hairy surfaces out of the mold after they had been cured.
In their experiments, the researchers mounted each hairy surface to a vertical, motorized stage, with the hairs facing outward. They then submerged the surfaces in silicone oil — a liquid that they chose to better observe any air pockets forming.
As each surface dove down, the researchers could see within the hairs a clear boundary between liquid and air, with air forming a thicker layer in hairs closer to the surface, and progressively thinning out with depth. Among the various surfaces, they found that those with denser fur that were plunged at higher speeds generally retained a thicker layer of air within their hairs.
Fur trap
From these experiments, it appeared that the spacing of individual hairs, and the speed at which they were plunged, played a large role in determining how much air a surface could trap. Hosoi and Nasto then developed a simple model to describe this air-trapping effect in precise, mathematical terms. To do this, they modeled the hair surfaces as a series of tubes, representing the spaces between individual hairs. They could then model the flow of liquid within each tube, and measure the pressure balance between the resulting liquid and air layers.
“Basically we found that the weight of the water is pushing air in, but the viscosity of the liquid is resisting flow (through the tubes),” Hosoi explains. “The water sticks to these hairs, which prevents water from penetrating all the way to their base.”
Hosoi and Nasto applied their equation to the experimental data and found their predictions matched the data precisely. The researchers can now accurately predict how thick an air layer will surround a hairy surface, based on their equation.
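To make that model concrete, here is a toy numerical sketch in Python. It is not the equations from the Physical Review Fluids paper: the tube geometry, the Poiseuille-flow assumption, and every parameter value below are illustrative assumptions, chosen only to show how hair spacing and plunge speed trade off.

# Toy sketch of the air-entrainment idea described above.
# NOT the published model: geometry, scaling, and parameters are
# illustrative assumptions only.

RHO = 1000.0      # liquid density, kg/m^3
G = 9.81          # gravity, m/s^2
MU = 0.1          # liquid viscosity, Pa*s (silicone-oil-like)

def air_fraction(gap_radius, hair_length, dive_speed, dive_depth, steps=10000):
    """Fraction of the inter-hair gap still holding air after a dive.

    Treats the space between hairs as a vertical tube (radius gap_radius,
    length hair_length). Hydrostatic pressure rho*g*z drives liquid in;
    Poiseuille-type viscous resistance slows it down.
    """
    dt = (dive_depth / dive_speed) / steps
    x = 1e-6  # liquid penetration depth into the tube (m), seeded tiny
    for i in range(steps):
        z = dive_speed * dt * i          # current depth of the surface
        pressure = RHO * G * z           # hydrostatic driving pressure
        # Poiseuille flow: mean speed of the liquid column of length x
        dxdt = gap_radius**2 * pressure / (8.0 * MU * x)
        x = min(x + dxdt * dt, hair_length)
        if x >= hair_length:
            return 0.0                   # fully wetted: no air trapped
    return 1.0 - x / hair_length

# Narrower gaps (denser fur) and faster plunges keep more air:
for r, v in [(50e-6, 0.5), (50e-6, 0.05), (200e-6, 0.5)]:
    print(f"gap {r*1e6:.0f} um, speed {v} m/s ->",
          f"{air_fraction(r, 2e-3, v, 0.2):.2f} of gap still air")

With these made-up numbers, the narrow-gap, fast-plunge case keeps part of the hair layer dry while the wide-gap and slow cases flood completely, mirroring the qualitative trend the researchers observed.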
“People have known that these animals use their fur to trap air,” Hosoi says. “But, given a piece of fur, they couldn’t have answered the question: Is this going to trap air or not? We have now quantified the design space and can say, ‘If you have this kind of hair density and length and are diving at these speeds, these designs will trap air, and these will not.’ Which is the information you need if you’re going to design a wetsuit. Of course, you could make a very hairy wetsuit that looks like Cookie Monster and it would probably trap air, but that’s probably not the best way to go about it.”
José Bico, a lecturer at ESPCI (the City of Paris Industrial Physics and Chemistry Higher Educational Institution) in Paris, points to another application for the group’s results: the process of industrial dip-coating, by which surfaces are dipped in polymer to achieve an even, protective coating.
“Air or liquid entrainment is a big deal in a lot of industrial coating applications,” says Bico, who was not involved in the research. “For instance, many treatments involve dipping of an object in a bath of some liquid. In that case, you do not want air to remain trapped. This model tells how fast one may [dip] before trapping air.”

Tuesday 11 October 2016

Transparent Memory Chips – The Next Step in Memory Storage


As technology moves forward, things get smaller, faster and now, possibly, transparent. A team of scientists has developed transparent, flexible memory chips that may one day replace flash drives and other personal data storage devices.
New memory chips that are transparent and flexible enough to be folded like a sheet of paper, that shrug off 1,000-degree Fahrenheit temperatures — twice as hot as the max in a kitchen oven — and that survive other hostile conditions could usher in the development of next-generation flash-competitive memory for tomorrow’s keychain drives, cell phones and computers, a scientist reported.
Speaking at the 243rd National Meeting & Exposition of the American Chemical Society, the world’s largest scientific society, research team leader James M. Tour, Ph.D., said devices with these chips could retain data despite an accidental trip through the dryer — or even a voyage to Mars. And with a unique 3-D internal architecture, the new chips could pack extra gigabytes of data while taking up less space.
“These new chips are really big for the electronics industry because they are now looking for replacements for flash memory,” Tour said. “These new memory chips have numerous advantages over the chips today that are workhorses for data storage in hundreds of millions of flash, or thumb drives, smart phones, computers and other products. Flash has about another six or seven years in which it can be built smaller, but then developers hit fundamental barriers.”
Because of the way that the new memory chips are configured, namely with two terminals per bit of information rather than the standard three terminals per bit, they are much better suited for the next revolution in electronics — 3-D memory — than flash drives.
“In order to put more memory into a smaller area, you have to stack components beyond two dimensions, which is what is currently available,” he said. “You have to go to 3-D.” And the chips have a high on-off ratio, which is a measure of how much electrical current can flow in the chip when it stores information versus when it is empty. A cell that passes a microamp of current in its “on” state but only a picoamp in its “off” state, for example, has an on-off ratio of a million. The higher the ratio, the more attractive the chips are to manufacturers.
The chips were originally composed of a layer of graphene or other carbon material on top of silicon oxide, which has long been considered an insulator, a passive component in electronic devices. Graphene is a thin layer of carbon atoms that’s touted as a “miracle material” because it is the thinnest and strongest known material; it was the subject of the 2010 Nobel Prize in Physics. Originally, the researchers at Rice University thought that the amazing memory capability of the chips was due to the graphene. They discovered recently that they were wrong. The silicon oxide surface was actually providing the memory effect, and the chips can now be made graphene-free. The work was done by Tour’s group in collaboration with Professor Douglas Natelson (Department of Physics) and Lin Zhong (Department of Electrical and Computer Engineering). The main students on the project were Jun Yao and Javen Lin.
The transparency and small size of the new chips enable them to be used in a wide range of potential applications. Manufacturers could embed them in glass for see-through windshield displays for everyday driving, military and space uses, so that not only is the display in the windshield, but also the memory. That frees up space elsewhere in the vehicle for other devices and functionalities. In fact, the chips were aboard the Russian Progress 44 cargo spacecraft that launched in August 2011 for further experimentation aboard the International Space Station. However, the vehicle never made it into space and crashed. “The spacecraft crashed over Siberia, so our chips are in Siberia!” said Tour. He hopes to send the chips on a future mission in July 2012 to see how the memory holds up in the high-radiation environment of space.
Current touch screens are made of indium tin oxide and glass, both of which are brittle and can break easily. However, plastic containing the memory chips could replace those screens with the added bonuses of being flexible while also storing large amounts of memory, freeing up space elsewhere in a phone for other components that could provide other services and functions. Alternatively, storing memory in small chips in the screen instead of within large components inside the body of a phone could allow manufacturers to make these devices much thinner.
The easy-to-fabricate memory chips are patented, and Tour is talking to manufacturers about embedding the chips into products.
What's the Big Idea?
There's only so much memory that can currently fit on a flash drive, to name one example of a small-memory device. The structure of the Rice unit -- in which memory is stacked in three-dimensional configurations -- significantly increases the amount of information a chip can hold. The team is currently working with companies that want to adapt their chips to the Rice model, and samples have been sent to the International Space Station to test their tolerance to radiation. Eventually, the team hopes to see these chips mass-manufactured at a reasonable price, and adopted by industries in ways that come straight out of science fiction: "Imagine heads-up windshields or displays with embedded electronics, or even flexible, transparent cellphones."

Monday 10 October 2016

A decentralized web would give power back to the people online


Recently, Google launched a video calling tool (yes, another one). Google Hangouts has been sidelined to Enterprise, and Google Duo is supposed to be the next big thing in video calling.
So now we have Skype from Microsoft, FaceTime from Apple, and Duo from Google. Each big company has its own equivalent service, each stuck in its own bubble. These services may be great, but they aren’t exactly what we imagined during the dream years when the internet was being built.
The original purpose of the web and internet, if you recall, was to build a common neutral network which everyone can participate in equally for the betterment of humanity. Fortunately, there is an emerging movement to bring the web back to this vision and it even involves some of the key figures from the birth of the web. It’s called the Decentralised Web or Web 3.0, and it describes an emerging trend to build services on the internet which do not depend on any single “central” organisation to function.
So what happened to the initial dream of the web? Much of the altruism faded during the first dot-com bubble, as people realised that an easy way to create value on top of this neutral fabric was to build centralised services which gather, trap and monetise information.
Search Engines (e.g. Google), Social Networks (e.g. Facebook), Chat Apps (e.g. WhatsApp) have grown huge by providing centralised services on the internet. For example, Facebook’s future vision of the internet is to provide access only to the subset of centralised services it endorses (Internet.org and Free Basics).
Meanwhile, it disables fundamental internet freedoms such as the ability to link to content via a URL (forcing you to share content only within Facebook) or the ability for search engines to index its contents (other than Facebook’s own search function).

The Decentralised Web envisions a future world where services such as communication, currency, publishing, social networking, search and archiving are provided not by centralised services owned by single organisations, but by technologies which are powered by the people: their own community. Their users.
The core idea of decentralisation is that the operation of a service is not blindly trusted to any single omnipotent company. Instead, responsibility for the service is shared: perhaps by running across multiple federated servers, or perhaps running across client side apps in an entirely “distributed” peer-to-peer model.
Even though the community may be “byzantine” and not have any reason to trust or depend on each other, the rules that describe the decentralised service’s behaviour are designed to force participants to act fairly in order to participate at all, relying heavily on cryptographic techniques such as Merkle trees and digital signatures to allow participants to hold each other accountable.
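To make that concrete, here is a minimal Python sketch (not taken from any particular project) of one of those cryptographic building blocks, the Merkle tree: participants who each hold a copy of the same data can check for tampering by comparing a single 32-byte root hash rather than the whole dataset.

import hashlib

def h(data: bytes) -> bytes:
    """SHA-256, the hash underpinning the tree."""
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    """Fold leaf hashes pairwise until a single root remains.

    Any peer holding the same data computes the same root, so two
    untrusting participants can detect tampering by comparing 32 bytes
    instead of the whole dataset.
    """
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2 == 1:          # duplicate the odd hash out
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

chunks = [b"block-0", b"block-1", b"block-2"]
print(merkle_root(chunks).hex())
# Changing any chunk changes the root:
print(merkle_root([b"block-0", b"tampered", b"block-2"]).hex())

Git, Bitcoin and IPFS all build on variations of this structure.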
There are three fundamental areas that the Decentralised Web necessarily champions: privacy, data portability and security.
Privacy: Decentralisation forces an increased focus on data privacy. Data is distributed across the network and end-to-end encryption technologies are critical for ensuring that only authorized users can read and write. Access to the data itself is entirely controlled algorithmically by the network as opposed to more centralized networks where typically the owner of that network has full access to data, facilitating customer profiling and ad targeting.
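As a small illustration of that point, here is a hypothetical end-to-end encryption sketch using the PyNaCl library (the participants and message are invented for illustration): whatever servers relay the message only ever see ciphertext.

# pip install pynacl
from nacl.public import PrivateKey, Box

# Each participant generates a keypair; private keys never leave the device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"meet at the usual place")

# Any relay server only ever sees ciphertext. Bob decrypts on his device.
receiving_box = Box(bob_key, alice_key.public_key)
print(receiving_box.decrypt(ciphertext))  # b'meet at the usual place'

The design point is that trust lives in the keys held at the edges, not in whoever operates the network in the middle.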

Data Portability: In a decentralized environment, users own their data and choose with whom they share this data. Moreover they retain control of it when they leave a given service provider (assuming the service even has the concept of service providers). This is important. If I want to move from General Motors to BMW today, why should I not be able to take my driving records with me? The same applies to chat platform history or health records.

Security: Finally, we live in a world of increased security threats. In a centralized environment, the bigger the silo, the bigger the honeypot is to attract bad actors. Decentralized environments are safer by their general nature against being hacked, infiltrated, acquired, bankrupted or otherwise compromised as they have been built to exist under public scrutiny from the outset.
Just as the internet itself triggered a grand re-levelling, taking many disparate unconnected local area networks and providing a new neutral common ground that linked them all, now we see the same pattern happening again as technology emerges to provide a new neutral common ground for higher level services. And much like Web 2.0, the first wave of this Web 3.0 invasion has walked among us for several years already.
Git is wildly successful as an entirely decentralised version control system – almost entirely replacing centralised systems such as Subversion. Bitcoin famously demonstrates how a currency can exist without any central authority, contrasting with a centralised incumbent such as PayPal. Diaspora aims to provide a decentralised alternative to Facebook. Freenet paved the way for decentralised websites, email and file sharing.
Less famously, StatusNet (now called GNU Social) provides a decentralised alternative to Twitter. XMPP was built to provide a decentralised alternative to the messaging silos of AOL Instant Messenger, ICQ, MSN, and others.

However, these technologies have always sat on the fringe – favourites for the geeks who dreamt them up and are willing to forgive their mass-market shortcomings, but frustratingly far from being mainstream. The tide is turning. The public zeitgeist is finally catching up with the realisation that being entirely dependent on massive siloed community platforms is not entirely in the users’ best interests.
Critically, there is a new generation of Decentralised Startups that has caught the attention of the mainstream industry, heralding the new age for real.
Blockstack and Ethereum show how Blockchain can be so much more than just a cryptocurrency, acting as a general purpose set of building blocks for building decentralised systems that need strong consensus. IPFS and the Dat Project provide entirely decentralised data fabrics, where ownership of and responsibility for data are shared by all those accessing it rather than ever being hosted in a single location.
The real step change in the current momentum came in June at the Decentralised Web Summit organised by the Internet Archive. The event brought together many of the original “fathers of the internet and World Wide Web” to discuss ways to “Lock the web open” and reinvent a web “that is more reliable, private, and fun.”
Brewster Kahle, the founder of the Internet Archive, saw first-hand the acceleration in decentralisation technologies whilst considering how to migrate the centralised Internet Archive to instead be decentralised: operated and hosted by the community that uses it, rather than being a fragile and vulnerable single service.
Additionally, the enthusiastic presence of Tim Berners-Lee, Vint Cerf, Brewster himself and many others of the old school of the internet at the summit showed that for the first time the shift to decentralisation had caught the attention and indeed endorsement of the establishment.
Tim Berners-Lee said:
The web was designed to be decentralised so that everybody could participate by having their own domain and having their own webserver and this hasn’t worked out. Instead, we’ve got the situation where individual personal data has been locked up in these silos. […] The proposal is, then, to bring back the idea of a decentralised web.
To bring back power to people. We are thinking we are going to make a social revolution by just tweaking: we’re going to use web technology, but we’re going to use it in such a way that we separate the apps that you use from the data that you use.
We now see the challenge is to mature these new technologies and bring them fully to the mass market. Commercially there is huge value to be had in decentralisation: whilst the current silos may be washed away, new ones will always appear on top of the new common ground, just as happened with the original Web. Github is the posterchild for this: a $2 billion company built entirely as a value-added service on top of the decentralised technology of Git — despite users being able to trivially take their data and leave at any point.
Similarly, we expect to see the new wave of companies providing decentralised infrastructure and commercially viable services on top, as new opportunities emerge in this brave new world.
Ultimately, it’s hard to predict what final direction Web 3.0 will take us in, and that’s precisely the point. Unlocking the web from the hands of a few players will inevitably enable a surge in innovation and let services flourish which prioritise the user’s interests.
Apple, Google, Microsoft, and others have their own interests at heart (as they should), but that means that the user can often be viewed purely as a source of revenue, quite literally at the users’ expense.
As the Decentralised Web attracts the interest and passion of the mainstream developer community, there is no telling what new economies will emerge and what kinds of new technologies and services they will invent. The one certainty is they will intrinsically support their communities and user bases just as much as the interests of their creators.