Sony’s domestic rivals also pulled the plug on their basic research labs one after another in the 2000s.
But the CSL managed to survive the tough times thanks in large part to its unique management system.
Its budget has been effectively pegged at one ten-thousandth of Sony’s overall sales. In its early years, the lab operated on an annual budget of 200 million to 300 million yen ($1.8 million to $2.7 million) because the company’s sales were 2 trillion to 3 trillion yen at the time. Its current budget is 800 million yen, as Sony now racks up 8 trillion yen in sales.
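The funding rule is simple enough to state as arithmetic. A minimal sketch (the function name is ours, purely for illustration):

```python
# The CSL funding rule described above: the lab's budget is pegged at
# one ten-thousandth of Sony's overall annual sales.
def csl_budget(annual_sales_yen: float) -> float:
    return annual_sales_yen / 10_000

early_budget = csl_budget(2_000_000_000_000)    # 2 trillion yen in sales
current_budget = csl_budget(8_000_000_000_000)  # 8 trillion yen in sales
# early_budget   -> 200,000,000 yen, matching the early-years figure
# current_budget -> 800,000,000 yen, matching today's figure
```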
Because of this funding system, the lab started with only three engineers and remains a small squad of around 30 today.
Verily, the life sciences division of Google parent company Alphabet, has raised a $1 billion investment round led by private equity firm Silver Lake Partners. Other first-time investors in the round include the Ontario Teachers’ Pension Plan.
While the company did not disclose what exactly the giant cash infusion will go toward aside from business babble about “support[ing] growth in key strategic areas,” it did say that Alphabet CFO Ruth Porat and Silver Lake Managing Director Egon Durban will be nominated to join Verily’s operating board.
Scenarios have been discovered in which it is impossible to prove whether or not a machine-learning algorithm could solve a particular problem. This finding might have implications for both established and future learning algorithms.
GE introduced Edison at the end of November 2018 and will be showing the platform and related tools at its booth. At launch, GE Healthcare described Edison as an AI platform designed to enable hospitals to reap more value from technology such as clinical apps running on devices, at the edge, and in the cloud, by combining diverse data sets from various settings, such as healthcare networks and life sciences companies.
The company is also looking to establish an ecosystem of third-party developers building for its platform. Apps that have already been announced include AIRx, an automated workflow tool for MRI brain scanning; Critical Care Suite, which helps identify pneumothorax at the point of care; CT Smart Subscription, which offers continuous access to CT software updates; and automated Lesion Segmentation on the LOGIQ E10, which increases productivity through automation. AIRx and Critical Care Suite are awaiting FDA approval.
Like many people, when Harvard lecturer in public policy Mark Fagan first heard the buzz around autonomous vehicles, he wrote it off as wishful thinking. That line of thinking, however, didn’t last long. “I just became convinced from talking to people that it wasn’t crazy, and it was really going to happen, and we ought to be ahead of it,” Fagan said.
As head of the new Autonomous Vehicle Policy Initiative at the Harvard Kennedy School’s (HKS) Taubman Center for State and Local Government, Fagan is working to do just that: helping officials craft policies while the technology is still emerging.
“What cities and towns are trying to do with AVs is plan in advance so that they bring them logically to the market in a way that supports public value as opposed to just private value,” Fagan said.
As part of their early effort, Fagan and Rafael Carbonell, executive director of the Taubman Center, reached out to the city of Boston, which has been a test bed for self-driving cars and was looking to dive deeper into the policy side.
In 2019, many of the first drafts of history will be written by artificial intelligence. Rather than spending tens or hundreds of hours synthesising information from thousands of sources, analysts will have a personalised AI that generates written briefings for them in minutes, auto-updating as data inputs change. AI will become a core layer of the stack.
That’s the good news. The bad news is that these same technologies are also very good at generating propaganda and disinformation – meaning that they are on the verge of pressure-testing some of our most closely held democratic processes and norms. The bot-generated propaganda we saw in the 2016 US presidential election was primitive at best. To the extent there was any automation at all, it was crude. In 2019, AI will allow content to be targeted, personalised and optimised to prey on our anxieties and hijack our attention for maximum political advantage. The end result could mean enormous – and possibly irreparable – disruption of democratic processes.
Apart from tech companies like Google and Apple, the Bay Area is home to innovation centers and R&D labs run by household names based outside of California. Walmart, Verizon, Johnson & Johnson, Target, and General Motors all run innovation outposts in the area. Accelerators like Y Combinator open their doors to corporates on demo days. Venture capital firms constantly fund the next wave of startups — and often invite in executives from the Fortune 500, hoping they’ll sign on as early customers.
Our list of 50 of the key sites for “big company” innovation in the Bay Area looks at the region through the lens of larger public companies making significant investments in R&D, innovation, or corporate venture capital. We’ve also included some of the venture capital firms, accelerators, and incubators that have the most interaction with corporates.
Five years ago, a team of researchers pored over the results of a prenatal genetic test given to more than 125,000 healthy pregnant women and made a stunning discovery. The blood test, marketed by gene-sequencing giant Illumina, was designed to detect chromosome anomalies associated with conditions such as Down syndrome by analyzing fragments of fetal DNA circulating in the mother’s blood.
In 3,757 of the tests, the scientists found at least one abnormality. But in 10 of those cases, further analysis revealed that the fetuses were in fact normal.
“In every one of those 10 cases, it turned out there was an undiagnosed cancer” in the mother, says Alex Aravanis, who at the time of the study was the senior director of research and development at Illumina.
To Aravanis and the other scientists, the unexpected result suggested a whole new opportunity: a single blood test for detecting multiple types of cancer before a person has any symptoms.
A bevy of papers at the end of December and this week propose a variety of solutions to make networks more manageable. They include, in no particular order:
Compressing the math needed to compute the weights of a neural network, in some cases reducing them from 32-bit floating point to 8-bit fixed-point (integer) numbers, or “binarizing” them, reducing them to either 1 or 0, or using “symmetry” of matrices, reducing the amount of storage needed to represent them;
“Pruning” the parameters, meaning removing some of the weights and the “activation function,” the computation that makes a given neuron in the neural net respond to the data;
Reducing the amount of data sharing over a network when running on many distributed computer systems, such as by selectively deciding which neurons exchange information about weights, or by partitioning the computation across different processors;
New kinds of neural network algorithms, and ways to partition them, to make more efficient use of new kinds of hardware.
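To make the first two ideas concrete, here is a minimal NumPy sketch (our own illustration, not code from any of the papers) of linear 8-bit quantization and magnitude-based pruning applied to a weight matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
# A toy 32-bit floating-point weight matrix standing in for one layer.
weights = rng.standard_normal((256, 256)).astype(np.float32)

# --- Quantization: map float32 weights to int8 with a linear scale ---
scale = np.abs(weights).max() / 127.0
q_weights = np.round(weights / scale).astype(np.int8)  # 1 byte per weight
dequantized = q_weights.astype(np.float32) * scale     # approximate recovery

storage_ratio = weights.nbytes / q_weights.nbytes      # 4x less storage
max_error = np.abs(weights - dequantized).max()        # bounded by scale/2

# --- Pruning: zero out the smallest-magnitude 90% of the weights ---
threshold = np.quantile(np.abs(weights), 0.9)
pruned = np.where(np.abs(weights) >= threshold, weights, 0.0)
sparsity = (pruned == 0).mean()                        # roughly 0.9
```

Both steps trade a small, controllable amount of accuracy for memory and bandwidth: the quantized copy is a quarter the size, and the pruned copy can be stored and transmitted in sparse form.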
Minneapolis, MN, June 3, at Minneapolis Central Library: “an all-day unconference where we investigate ways in which the creative coding and library communities can work together.” [registration required]
“The National Geographic Society (NGS) is seeking applications for their new Conservation Technologies RFP to create novel tools and technologies to monitor ecosystem health.” Deadline for applications is January 9.
Call for Nominations for the Replication Award – “The OHBM Replication Award recognizes the best replication study and highlights OHBM’s commitment to reproducibility in neuroimaging research and helps begin to reshape the incentives towards replication.” Deadline for nominations is January 11.
“Neurohackademy is a two-week hands-on summer institute in neuroimaging and data science, held at the University of Washington eScience Institute, July 29th – August 9th, 2019.” Deadline to apply is February 18.
Journal of the American Medical Informatics Association; Willem G van Panhuis, Anne Cross, Donald S Burke
In 2013, we released Project Tycho, an open-access database comprising 3.6 million counts of infectious disease cases and deaths reported for over a century by public health surveillance in the United States. Our objective is to describe how Project Tycho version 1 (v1) data has been used to create new knowledge and technology and to present improvements made in the newly released version 2.0 (v2). [full text]