Joshua New: NOAA collects more data than probably any other government agency, and likely more than almost any private sector organization as well. Just how much data does NOAA collect on a daily basis? How do you expect this to change as new data technologies, such as more powerful satellites and the Internet of Things, proliferate?
Ed Kearns: NOAA collects many observations of the atmosphere and oceans, and also generates large volumes of data from computer models that use those observations. Many of the basic, routine observations from satellites and radars are processed into higher level information products that help NOAA meet its mission. On top of that, NOAA’s ships, floats, gliders, and aircraft collect large volumes of data, but not always on a daily basis. Our biologists are collecting not only the traditional fisheries data but are now generating high volumes of genomics information as well. NOAA is a fascinating organization with a complex mission, and its data collection systems are a window into that. So, “how much data” is collected every day is a rather more complex question than it may initially seem!
Superresolution microscopy techniques allow researchers to observe objects tens of nanometers in size on or inside cells. But the methods can only keep an eye on patches 100 μm or less on a side at a time, making it difficult to image multiple cells simultaneously.
Now, by coupling a photonic chip with a standard optical microscope, researchers have achieved superresolution fluorescence microscopy with a simpler set-up and a wider field of view than conventional methods (Nat. Photonics 2017, DOI: 10.1038/nphoton.2017.55).
“You just need a basic microscope,” says Balpreet S. Ahluwalia of the Arctic University of Norway. “Our photonic chip technology can be retrofitted with any standard microscope to convert it into an optical nanoscope.” The chips allow Ahluwalia, Mark Schüttpelz of Bielefeld University, and coworkers to separate the illumination and detection pathways so the two don’t interfere with each other, enabling the use of lenses with wider fields of view.
NIST scientists have developed a novel automated probe system for evaluating the performance of computer components designed to run 100 times faster than today’s best supercomputers and consume as little as 1/1000th the energy.
That range of performance, as envisioned in the National Strategic Computing Initiative (NSCI), is the overarching goal of many private-sector and federal programs studying different technologies and platforms. One of those is the Cryogenic Computing Complexity (C3) program, supported by the Intelligence Advanced Research Projects Activity (IARPA). Its aim is to enable a new generation of low-power superconducting supercomputers that operate at liquid-helium temperatures and use ultra-fast switching of microscopic circuit elements called Josephson junctions.
A new study indicates that the number of plant and animal species at risk of extinction may be considerably higher than previously thought. The research team behind it, however, believes it has come up with a formula that will help paint a more accurate picture.
The study appears in the journal Biological Conservation.
The maps describing species’ geographic ranges, which are used by the International Union for Conservation of Nature (IUCN) to determine threat status, appear to systematically overestimate the size of the habitat in which species can thrive, said Don Melnick, senior investigator on the study and the Thomas Hunt Morgan Professor of Conservation Biology in the Department of Ecology, Evolution and Environmental Biology (E3B).
A team of NTU researchers led by Professor Ong Yew Soon has developed a prototype software which enables creative individuals to develop sophisticated and visually appealing games in a short time frame.
This advanced AI, named IntelliK, can significantly reduce both the cost and time needed for game development, a key research focus at NTU’s new Data Science & Artificial Intelligence Research Centre (DSAIR).
The new research centre, which will receive about S$8 million in funding from NTU over the next three years, has attracted the attention of top international firms.
Ikea makes furniture and household goods, not technology. And yet, the Swedish company’s external innovation lab, Space10, is launching a global survey meant to gauge people’s thoughts about artificial intelligence. Central to the survey’s mission is helping Space10 and Ikea understand what form consumers want AI to take.
Do we want our machines to act like machines, or do we want them to have a personality? Should they be male or female? Should they have names?
Should AI reflect each individual’s worldview? Should it live only on our phones or in our homes, or should it permeate the environment we live in? Should it be able to read and react to our emotions? These are the questions that AI designers and engineers are already asking about machine learning-enabled technologies, and now Ikea wants to ask the rest of us as well.
Aditya Shastry of India had two years of statistics experience in the finance field, the start of what seemed to be a lucrative career. But he found his work limiting; he wanted to work on his own project.
He applied to the University of Massachusetts Amherst master’s degree program in data science.
“Instead of learning in the industry specific programs where you have one model you develop, I wanted to get a broader understanding so I could implement something of my own,” he said.
Shastry was one of many students in the UMass Data Science program, now in its second year, who got to rub elbows with data science professionals from the likes of Google, NVIDIA, MassMutual, and dozens of other companies at a research symposium on Thursday, April 27.
The Noah Harding Professor of Computer Science and Professor of Bioengineering at Rice University, Lydia E. Kavraki, has been named the Association for Computing Machinery (ACM) 2017-2018 Athena Lecturer.
Each year, the Athena Lecturer award celebrates women researchers who have made fundamental contributions to computer science. Kavraki has been cited for the invention of randomized motion-planning algorithms in robotics and the development of robotics-inspired methods for bioinformatics and biomedicine.
Is the quality and overall state of social and personality research “rotten to the core,” as has been debated by psychologists in recent years?
The answer is no, according to University of Illinois at Chicago researchers who conducted two studies to examine how practices have changed, if at all.
In one study, the UIC researchers surveyed over 1,100 social and personality psychologists from the three largest professional organizations — the Society for Personality and Social Psychology, the European Society for Social Psychology, and the Society for Australasian Social Psychologists — about how the current debate has affected their perceptions of their own research practices and the field’s.
“Is the camera having an impact on the way officers use force? Is it reducing the number of citizens’ complaints? Is it having a negative impact? All of those types of things I would like to know about these cameras,” says Peter Newsham, the chief of police in Washington, D.C., where a similar study is just weeks from providing its first answers.
When officials in D.C. decided to deploy cameras a few years ago, the city happened to have a bunch of researchers who were just waiting to do a big, well-controlled study.
The researchers designed a field experiment to systematically compare cops wearing cameras to officers without cameras in one police force, in a major American city.
By 2021, the average person will have multiple connected mobile devices, and 75 percent of mobile data traffic will be video. This added video will require new robust technologies to improve the viewer experience.
Carnegie Mellon University today announced that it is collaborating with Intel Corporation on a three-year, $4.125 million research program to unlock the value of the growing volume of online video and put new analytics capabilities and immersive technologies within reach of consumers, businesses and public officials.
What is a wave? A wave in my parlance is a set of articles that make the same new (and possibly erroneous) claim, plus associated social media posts. A wave is significant if it is growing in engagement. Since the cost of human intervention is high, it only makes sense to flag significant waves that have traits that suggest misinformation.
The goal of the detection algorithm is to flag suspicious waves before they cross an exposure threshold, so that human responders can do something about them.
To make this concrete: Let us say that a social media platform has decided that it wants to fully address fake news by the time it gets 10,000 shares. To achieve this they may want to have the wave flagged at 1,000 shares, so that human evaluators have time to study it and respond. For search, you would count queries and clicks rather than shares and the thresholds could be higher, but the overall logic is the same.
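The flagging logic described above can be sketched in a few lines. This is an illustrative toy, not an actual platform implementation; the threshold values and the "still growing" test for a significant wave are assumptions drawn from the numbers in the example.

```python
# Illustrative sketch of threshold-based wave flagging. The thresholds
# below are the hypothetical values from the example, not real platform
# settings.

REVIEW_THRESHOLD = 1_000    # flag here so human evaluators have time to respond
EXPOSURE_LIMIT = 10_000     # point by which the platform wants the wave handled

def should_flag(shares_now: int, shares_before: int) -> bool:
    """Flag a wave that is still growing in engagement and has crossed
    the review threshold but not yet the exposure limit."""
    growing = shares_now > shares_before
    return growing and REVIEW_THRESHOLD <= shares_now < EXPOSURE_LIMIT
```

For search, the same logic would run on query and click counts instead of shares, with correspondingly higher thresholds.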
Google’s wireless broadband team has a new target market: NASCAR racecars.
The company is seeking permission from the FCC for an experimental radio license to test “the transmission of broadband data from racecars to transportable/fixed base stations located at racetrack facilities.”
It’s all happening this summer at four events in NASCAR-loving locales like Tennessee, Michigan, South Carolina, and Virginia.
Until recently, the word data didn’t require a modifier. But we passed a watershed moment when we started referring to big data. Apparently, that wasn’t a sufficient description for some chunks of data, because people grasped for bolder terms, such as humongous data. Sadly, now, it appears that we have run out of appropriate adjectives. And yet data keeps getting bigger and bigger.
So instead of mentioning data, people have begun waving their hands and talking vaguely about the “cloud.” This seems to be the perfect metaphor—a mystical vapor hanging over Earth, occasionally raining information on the parched recipients below. It is both unknowable and all-knowing. It answers all questions, if only we know how to interpret those answers.
Neural networks, cloud computing, deep learning, and in silico wizardry are on the cusp of disintermediating pharmaceutical drug discovery, cutting billions of dollars off the industry’s cost of new drugs and reducing the time to get new medicines approved to just a few processing cycles. “Software eats biotech”, or so goes this new variant of a decades-old thesis. This time could be different – we could be at the singularity when “humans transcend biology” – but I don’t think so.
Minneapolis, MN Create Together Day is a hackathon-style event for anyone with a desire to share great ideas to advance citizen science. Collaborate with citizen science leaders from around the globe to create products, interfaces, or data visualizations to improve specific citizen science projects or the field as a whole. May 17. [$$]
Ottawa, Ontario, Canada The conference focuses on critical questions about data’s power, reflecting on the social and cultural consequences of data becoming increasingly pervasive in our lives. June 22-23. [$$$]
University of California-Berkeley, School of Information
Berkeley, CA The DataEDGE conference at UC Berkeley will bring together senior industry and academic leaders for a conversation about the challenges and opportunities created by the rise of big data. May 8-9. [$$$]
NSF requires a Letter of Collaboration from a regional Big Data Hub for any proposal responding to this solicitation. Requests for Letters of Collaboration may be submitted via the web-based Submission form for Big Data Hub Letters of Collaboration. Deadline is June 19.
Tokyo, Japan ACM IUI 2018 is the 23rd annual meeting of the intelligent interfaces community and serves as a premier international forum for reporting outstanding research and development on intelligent user interfaces. Deadline for papers is October 8.
The Netflix Tech Blog, Mike McGarr and Dianne Marsh
For the past 8 years, Netflix has been building and evolving a robust microservice architecture in AWS. Throughout this evolution, we learned how to build reliable, performant services in AWS. Our microservice architecture decouples engineering teams from each other, allowing them to build, test and deploy their services as often as they want. This flexibility enables teams to maximize their delivery velocity. Velocity and reliability are paramount design considerations for any solution at Netflix.
As supported by our architecture, microservices provide their consumers with a client library that handles all of the IPC logic. This provides a number of benefits to both the service owner and the consumers. In addition to the consumption of client libraries, the majority of microservices are built on top of our runtime platform framework, which is composed of internal and open source libraries.
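The client-library pattern described above can be illustrated schematically: the service team ships a client whose methods hide the IPC details (transport, retries, error handling) from consumers. All names and the retry policy here are hypothetical stand-ins, not Netflix's actual internal libraries.

```python
# Schematic sketch of a microservice client library: consumers call
# methods, while the IPC logic (transport and retries) stays inside the
# client. The class name and retry policy are illustrative assumptions.

class ServiceClient:
    """Hypothetical client library wrapping an injectable transport."""

    def __init__(self, transport, retries: int = 3):
        self.transport = transport  # callable(path) -> dict, e.g. an HTTP call
        self.retries = retries

    def call(self, path: str) -> dict:
        # Retry logic lives in the client library, invisible to consumers.
        last_error = None
        for _ in range(self.retries):
            try:
                return self.transport(path)
            except OSError as err:
                last_error = err
        raise last_error
```

Injecting the transport keeps the example self-contained and mirrors the benefit the post describes: the service owner can change IPC details without touching every consumer.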