After closing a $20 million Series A financing round last December, Stanford University spinout Deepcell is transitioning this year from quiet technology development to building commercial inroads for its artificial intelligence-driven cell isolation technology, which the company believes can support a new generation of molecular, phenotypic, and translational research.
Founded in 2017 by Stanford professor Euan Ashley, his postdoc Maddison Masaeli, and their collaborator, computer scientist Mahyar Salek, the company has developed a platform to isolate, analyze, and classify individual cells from tissue or blood samples, using a combination of image-based machine learning and microfluidics. The method allows for the delivery of intact and viable single cells with the ability to select for and separate morphological cell subpopulations without the bias of predetermined parameters.
According to the company, the technology can isolate cells occurring at frequencies as low as one in a billion, with potential applications across areas including single-cell genomics, liquid biopsy, prenatal diagnosis, characterization of cellular and molecular interactions in specific disease states, and drug development.
It seems obvious that the opinions of 2.3 million people would be more representative than the opinions of a randomly selected 400. In reality, it depends entirely on how the bigger data set was put together.
Hoping that high quantity can compensate for low quality is a classic mistake in the burgeoning field of big data, says Xiao-Li Meng, a professor of statistics at Harvard who’s the founding editor-in-chief of the 2-year-old Harvard Data Science Review.
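Meng's point can be illustrated with a toy simulation (hypothetical numbers, not from the article): a huge sample collected with even a slight selection bias lands further from the truth than a small simple random sample.

```python
import random

random.seed(0)

# Hypothetical population: 1,000,000 people, exactly 50% hold opinion "yes".
N = 1_000_000
population = [1] * (N // 2) + [0] * (N // 2)
true_mean = sum(population) / N  # 0.5

# Large but biased sample: "yes" holders are slightly more likely to respond.
# A tiny selection effect swamps the benefit of sheer sample size.
biased = [x for x in population
          if random.random() < (0.25 if x == 1 else 0.20)]

# Small simple random sample of 400.
srs = random.sample(population, 400)

big_est = sum(biased) / len(biased)
small_est = sum(srs) / len(srs)

print(f"true proportion:            {true_mean:.3f}")
print(f"biased sample (n={len(biased):,}): {big_est:.3f}")
print(f"random sample (n=400):      {small_est:.3f}")
```

With these made-up response rates the biased estimate settles near 0.56 no matter how large the sample grows, while the 400-person random sample typically lands within a few points of 0.50; quality of collection, not quantity, drives the error.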
The Interactive Advertising Bureau (IAB), the national trade association for the digital media and marketing industries, has tasked its AI Standards Working Group with developing artificial intelligence (AI) standards, best practices, use cases, and terminology, in an effort to scale AI and help the industry reach its full potential. The group is newly co-chaired by IBM Watson Advertising and Nielsen.
The first release of 2021, “Artificial Intelligence Use Cases and Best Practices for Marketing,” will help executive leaders, marketers, and technologists get the most from AI, and do it responsibly.
The decline of ad-supported journalism starts with asymmetry in measurement. When a free publisher includes an additional ad on a page, the positive impact of the additional ad revenue is immediate and easily measured. However, measuring and attributing the associated long-term downside is not as straightforward.
Services that require user accounts, like Facebook or NYT, can easily get a holistic view of their users' long-term behavior across devices because all usage is tied back to an account. Unfortunately, most free publishers have yet to find a compelling reason to get their visitors to create an account, meaning the best they can do is tie usage back to a cookie. In our mobile-first world, cookies are fickle at best. Even on a single device, they typically aren't shared between the main browser and the browsers in apps, and they definitely aren't shared across multiple devices.
If you can’t accurately attribute product changes to long-term changes in user behavior, you’re going to struggle to optimize your product. Combine this lack of precise tracking with the inherent volatility of the news cycle, and you can see why it becomes very difficult for a free publisher to answer the question ‘what’s the harm?’ when deciding if they should include yet another ad on their page.
“We have these students saying these things that I did not expect them to so openly share,” said Tara Hughes, the “voice” of Ekhobot at CSU Channel Islands. “Students saying, ‘I really miss my roommate, they were my best friend.’ Some who went home and ended up becoming caregivers for their parents … or now became a sole breadwinner.”
Csunny is a chatbot that uses artificial intelligence to text with students at Cal State Northridge.
The CSU chatbots like Billy, whose name was inspired by the school’s mascot, Billy Bronco, were designed with a different purpose in mind.
A few years ago, Elizabeth Adams, associate vice president of undergraduate studies at Cal State Northridge, heard about Georgia State University using a text bot to help reduce the summer drop-off in students who plan to enroll in college but ultimately don’t.
“I thought, ‘We need that — but for equity,’” Adams said, referring to the unequal academic outcomes experienced by low-income, first-generation and other underrepresented students.
Since early in the COVID-19 pandemic, researchers and policymakers have raised serious concerns about impacts on women, including those pursuing careers in academic science. Today, the U.S. National Academies of Sciences, Engineering, and Medicine adds to the chorus with its report on how women academics in science, technology, engineering, math, and medicine are faring.
In the 253 pages of the report—which reinforces existing concerns, particularly for women who are caregivers—some of the most resonant pieces are survey responses from women faculty members about the challenges they faced during the first 6 months of the pandemic. “The experiences described in the survey are heartbreakingly vivid and all too familiar,” says Reshma Jagsi, an oncology professor at the University of Michigan and a member of the committee that commissioned the report. Here is a selection.
People often feel that they can intuitively recognize whether something is alive, but nature is filled with entities that flout easy categorization as life or non-life — and the challenge may intensify as other planets and moons open up to exploration. In this excerpt from his new book, Life’s Edge: The Search for What It Means to Be Alive, published today, the science writer Carl Zimmer discusses scientists’ frustrated efforts to develop a universal definition of life.
Carnegie Mellon University, School of Computer Science
Katharine “Kit” Needham has been named the School of Computer Science’s inaugural assistant dean for entrepreneurship initiatives.
Needham, who also serves as the director of Project Olympus, joined Carnegie Mellon in 2008 to work on the then newly formed startup incubator with its founding director Lenore Blum.
“Even today, many people don’t know what Project Olympus is,” Needham said. “Hopefully, this will give a little more visibility to our programs and what we do. It sends a strong message.”
One of the hardest parts of genetic research is reading DNA. Every cell of our body contains a copy of our entire genetic code, but only some of that genetic code is actually used.
Now, researchers at Harvard University's Department of Stem Cell and Regenerative Biology, working with GPU manufacturer NVIDIA, have developed a method of quickly and accurately identifying the wadded-up DNA buried in our cells, using machine learning and GPUs. It might help us detect cancer and genetic disease earlier and faster.
Aaron Morris, co-founder and CEO of East Liberty-based startup Allvision, says most people wouldn’t believe how many times the assets of a highway change.
He’s talking about the signs that tell you what exit is coming up or what town you’re in, as well as the barriers and guard rails along the road.
That’s a lot for humans to keep track of, so Allvision plans to use computer vision and machine learning to create a 3D digital representation that can help point out missing signs or dangerous gaps in the guardrails.
To make that happen, Allvision announced a partnership Tuesday with TomTom, an Amsterdam-based location tech specialist. TomTom offers mapping tools, navigation software and real-time traffic information and services that can be used for in-car navigation systems, smart mobility initiatives and autonomous vehicles.
Today critics charge that artificial intelligence (AI) feeds on biased datasets, amplifying the existing anti-female biases of our societies, and that AI is perpetuating harmful stereotypes of women as submissive and subservient. Is it any wonder when only 22% of AI professionals globally are women?
On International Women’s Day (8 March), UNESCO and the World Economic Forum joined forces to host an online panel on gender equality and women’s leadership in Artificial Intelligence. This timely round-table brought together a range of leading female voices in tech from around the world to confront the deep-rooted gender imbalances skewing the development of artificial intelligence. More than 60,000 viewers participated in the digital event. The event was moderated by Natashya Gutierrez, Editor-in-Chief of VICE Asia.
“How do we stop bias? By making sure that women are not only consumers, but producers of AI: We need more female intelligence in artificial intelligence – in the data, in algorithms, and in the sector,” said Gabriela Ramos, Assistant Director-General for Social and Human Sciences, UNESCO.
A BC initiative with the goal of closing the gender gap in artificial intelligence has helped advance the careers of 250 women in AI, machine learning, and data science.
Athena Pathways is a Digital Supercluster initiative that officially launched in March 2020 with a goal of enrolling 500 BC women into the fields of artificial intelligence, data science, and machine learning by September 2021.
“In addition to drawing more talent into the ecosystem, Athena will work to correct a gender imbalance that threatens the performance of AI technology itself if left unchecked; models are algorithms fitted to data, and diversity is a critical part of accurate models,” Norma Sheane, administration manager of the Artificial Intelligence network of British Columbia (AInBC), told Daily Hive in an interview.
Scientists have made a major advance in harnessing machine learning to accelerate the design for better batteries. Instead of using machine learning just to speed up scientific analysis by looking for patterns in data—as typically done—the researchers combined it with knowledge gained from experiments and equations guided by physics to discover and explain a process that shortens the lifetimes of fast-charging lithium-ion batteries.
It was the first time this approach—known as “scientific machine learning”—has been applied to battery cycling, said Will Chueh, an associate professor at Stanford University and investigator with the Department of Energy’s SLAC National Accelerator Laboratory who led the study. He said the results overturn long-held assumptions about how lithium-ion batteries charge and discharge and give researchers a new set of rules for engineering longer-lasting batteries.
The research, reported in Nature Materials, is the latest result from a collaboration between Stanford, SLAC, the Massachusetts Institute of Technology and Toyota Research Institute (TRI). The goal is to bring together foundational research and industry know-how to develop a long-lived electric vehicle battery that can be charged in 10 minutes.
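As a purely illustrative sketch of the "physics plus learning" idea (the models in the Nature Materials paper are not reproduced here, and all numbers below are synthetic), one can fit a physics-motivated capacity-fade law, Q(n) = Q0 − k·√n, commonly used for diffusion-limited side reactions, instead of an arbitrary black-box curve:

```python
import numpy as np

# Hypothetical illustration of "scientific machine learning": rather than
# fitting an arbitrary black-box model, fit a physics-motivated one.
# Capacity fade from solid-electrolyte interphase growth is often modeled
# as Q(n) = Q0 - k*sqrt(n), where n is the cycle number.
rng = np.random.default_rng(42)
cycles = np.arange(1, 501)
k_true, q0_true = 0.004, 1.0
capacity = q0_true - k_true * np.sqrt(cycles) + rng.normal(0, 0.002, cycles.size)

# Linear least squares in the feature sqrt(n) learns (Q0, k) from data.
A = np.column_stack([np.ones(cycles.size), np.sqrt(cycles)])
(q0_fit, slope_fit), *_ = np.linalg.lstsq(A, capacity, rcond=None)
k_fit = -slope_fit

print(f"fitted Q0 = {q0_fit:.4f}, fade rate k = {k_fit:.5f}")
```

The payoff of the hybrid approach is that the learned parameter k is physically interpretable (a side-reaction rate), unlike the weights of a generic neural network.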
The researchers employed a machine-learning algorithm to tease out the heartbeats from other sounds and signals such as breathing, which is easier to detect because it involves a much larger motion. The algorithm was also needed to zero in on erratic heart rhythms — which from a health perspective are generally more important to identify than a steady “lub-dub.”
“If you have a regular pattern, it is easy to find,” said Dr. Arun Sridhar, assistant professor of cardiology at the UW School of Medicine. “If it’s all over the place, it’s challenging.”
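Why a regular pattern is "easy to find" can be shown with a classical frequency-domain sketch (hypothetical numbers, not the UW team's algorithm): in a synthetic motion trace, breathing is a large, slow oscillation and the heartbeat a much smaller, faster one, so searching for the dominant frequency only within a plausible cardiac band recovers the heart rate.

```python
import numpy as np

# Hypothetical sketch, not the UW team's method: breathing (~0.2 Hz) is
# a much larger motion than the heartbeat, so we look for the dominant
# frequency only inside a plausible cardiac band.
fs = 100.0                                       # sample rate, Hz
t = np.arange(0, 30, 1 / fs)                     # 30 s of synthetic motion
breathing = 10.0 * np.sin(2 * np.pi * 0.2 * t)   # dominant slow motion
heartbeat = 0.5 * np.sin(2 * np.pi * 1.2 * t)    # 72 bpm, 20x smaller
noise = np.random.default_rng(1).normal(0, 0.05, t.size)
signal = breathing + heartbeat + noise

# Search 0.8-3 Hz (48-180 bpm), ignoring the huge respiratory peak.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
band = (freqs >= 0.8) & (freqs <= 3.0)
bpm = 60 * freqs[band][spectrum[band].argmax()]
print(f"estimated heart rate: {bpm:.0f} bpm")
```

An erratic rhythm has no single clean spectral peak, which is exactly why a simple band search fails on arrhythmias and the researchers turned to machine learning.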
In just over a year, the virus that causes COVID-19 has become the most sequenced virus of all time—soaring past such longtime contenders as HIV and influenza. Thousands of coronavirus genomes are sequenced around the world every day; several were generated in just the minute it’s taken for you to read these three paragraphs. “It’s been a revolution,” says Judith Breuer, a virologist at University College London.
We are now living through the first pandemic in human history where scientists can sequence fast and furiously enough to track a novel virus’s evolution in real time—and to act decisively on that information. Viruses constantly acquire mutations—genetic typos—and occasionally they mutate into a variant of interest. It was sequencing that identified a distinct and more transmissible variant in the UK. It was sequencing that prompted stricter lockdowns in response. And it’s now sequencing that is tracking the spread of variants, including those first found in South Africa and Brazil, that have mutations blunting immunity from vaccines and previous infections.
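At its core, spotting those "genetic typos" is a comparison problem: once genomes are aligned against a reference, mutations are position-by-position differences. A toy sketch (hypothetical sequences, not real SARS-CoV-2 data; real pipelines first align whole ~30 kb genomes):

```python
# Toy illustration with made-up sequences: after alignment, substitutions
# are simply positions where the sample differs from the reference.
reference = "ATGTTTGTTTTTCTTGTTTTATTGCCACTAGTC"
sample    = "ATGTTTGTTTTACTTGTTTTATTGCCACTCGTC"

mutations = [
    f"{ref}{i + 1}{alt}"            # standard notation: ref base, position, new base
    for i, (ref, alt) in enumerate(zip(reference, sample))
    if ref != alt
]
print(mutations)  # → ['T12A', 'A30C']
```

Scaled up to hundreds of thousands of genomes, the same bookkeeping is what lets surveillance labs notice when a cluster of sequences shares a distinctive set of substitutions, i.e. a new variant.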
Online March 16-18. “USD’s first Artificial Intelligence Symposium aims to unite academia, industry and government AI & Data Engineering experts to solve current challenges in various applications such as healthcare, cyberthreats, quantum computing, sustainable agriculture and risk management.” [free, registration required]
“For all tasks, we provide pose estimates of socially interacting mice in a standard resident-intruder assay, tracked using MARS (preprint: https://biorxiv.org/content/10.1101/2020.07.26.222299v1.full). For simplicity, video data will not be provided.” Challenge runs until April 30.
SPONSORED CONTENT
The eScience Institute’s Data Science for Social Good program is now accepting applications for student fellows and project leads for the 2021 summer session. Fellows will work with academic researchers, data scientists and public stakeholder groups on data-intensive research projects that will leverage data science approaches to address societal challenges in areas such as public policy, environmental impacts and more. Student applications due 2/15 – learn more and apply here. DSSG is also soliciting project proposals from academic researchers, public agencies, nonprofit entities and industry who are looking for an opportunity to work closely with data science professionals and students on focused, collaborative projects to make better use of their data. Proposal submissions are due 2/22.
The list covers high-level insights, migrations, deadlock reduction and lambdas. Instead of focusing too deeply on technology, I’m aiming to improve the UX of using a database and to reduce the psychological burden of what can sometimes be seen as a brittle, arcane piece of infrastructure. That sounds like an exaggeration, but I’ve worked at companies where “that’ll need a database change” elicited sighs from the room. I’ve also worked for companies where the databases were offloaded to specialists and the rest of the team felt that there was a barrier to using them. These ideas are part of a foundation that addresses UX concerns and allows future databases to be a continuing source of empowerment for entire teams.
For now, find it here! http://vis.csail.mit.edu/pubs/beyond-expertise-roles.pdf Shout out to wonderful collaborators @arvindsatya1, @steveg_cs, and @kevnam.
note: We view “interpretability” broadly, i.e. any (static or interactive) insight into the system, whether prediction uncertainty, data collection info, more mechanistic explanations, etc. We use “interpretability” because of precedent but welcome ideas re: better/broader terms.