EPFL’s Swiss Plasma Center (SPC) has decades of experience in plasma physics and plasma control methods. DeepMind is a scientific discovery company acquired by Google in 2014 that’s committed to ‘solving intelligence to advance science and humanity’. Together, they have developed a new magnetic control method for plasmas based on deep reinforcement learning, and applied it to a real-world plasma for the first time in the SPC’s tokamak research facility, TCV.
Nature Communications; Devis Tuia et al.; h/t Twitter, Yann LeCun
Inexpensive and accessible sensors are accelerating data acquisition in animal ecology. These technologies hold great potential for large-scale ecological understanding, but are limited by current processing approaches, which inefficiently distill data into relevant information. We argue that animal ecologists can capitalize on large datasets generated by modern sensors by combining machine learning approaches with domain knowledge. Incorporating machine learning into ecological workflows could improve inputs for ecological models and lead to integrated hybrid modeling tools. This approach will require close interdisciplinary collaboration to ensure the quality of novel approaches and train a new generation of data scientists in ecology and conservation.
U.S. research institutions and universities are gearing up to implement steps announced last month by the Biden administration to ensure that scientists seeking federal grants are not beholden to foreign governments or interests.
The White House National Science and Technology Council issued a set of guidelines in January designed to ensure that scientists seeking federal grants do not have conflicts of interest stemming from their participation in foreign talent recruitment programs. The guidelines address a presidential national security memorandum issued in early 2021.
That memorandum required any research institution receiving more than $50 million in federal science and technology grants in a year to certify that it has a research security program that can identify conflicts of interests.
University of Wisconsin, College of Life & Agricultural Sciences, CALS News
To better understand the media coverage about soil conservation practices, researchers from the University of Wisconsin–Madison used advanced data science methods to collect and analyze online stories about soil conservation from Wisconsin’s most-read agricultural print media outlets: Agri-View, The Country Today, Wisconsin Agriculturalist, and Wisconsin State Farmer.
“In considering the potential benefits of soil conservation practices, it is important to understand where farmers get their information about these practices and what types of information they are exposed to and by whom,” says study co-author Bret Shaw, associate professor in the Department of Life Sciences Communication and environmental communication specialist for the Division of Extension at UW–Madison. “We know these media outlets reach a lot of farmers, and it’s valuable to know the role they play in getting information out about soil conservation, given how important agriculture is in Wisconsin.”
From facial recognition to autonomous vehicles, the adoption of artificial intelligence-driven applications has exploded so quickly that it has outpaced efforts to understand how these machine-learning technologies affect human life.
To help examine artificial intelligence (AI) systems and evaluate their impact, Underwriters Laboratories Inc. and Northwestern University today announced the creation of a research hub that seeks to better incorporate safety and equity into the fast-growing technology.
The Digital Intelligence Safety Research Institute (DISRI) at Underwriters Laboratories will support the research collaboration, committing $7 million to the research hub over the next three years as well as expertise and resources. Northwestern will host, and the two institutions will jointly lead, the research and operations of the new hub, the Center for Advancing Safety of Machine Intelligence (CASMI). The DISRI-CASMI partnership aims to bring together and coordinate a wide-ranging research network focused on maximizing machine learning’s benefits while recognizing and averting potential negative effects.
Mathematicians often work together when they’re searching for insight into a hard problem. It’s a kind of freewheeling collaborative process that seems to require a uniquely human touch.
But in two new results, the role of human collaborator has been replaced in part by a machine. The papers were completed at the end of November and summarized in a recent Nature article.
“The things that I love about mathematics are its intuitive and creative aspects,” said Geordie Williamson, a mathematician at the University of Sydney and co-author of one of the papers. “The [machine learning] models were supporting that in a way that I hadn’t felt from computers before.”
Stanford HAI and the Stanford Digital Economy Lab submitted this response in January 2022 to support the work of the National Institute of Standards and Technology (NIST) to advance a more productive technology economy. NIST’s work in this area is more essential than ever as remote work, artificial intelligence (AI), and other new technologies change the job landscape and the future of the economy. In this submission, we discuss why productivity has not been sufficiently understood or measured, especially from AI; how AI is still emerging and may help workers and economic growth; and the need for immigration reform to attract and retain top talent. This response was co-led by Erik Brynjolfsson and Georgios Petropoulos.
The Pew Charitable Trusts, Stateline, Jenni Bergal
Cybersecurity experts say QR codes have also created new opportunities for fraudsters, who can tamper with them and direct victims to malicious websites to steal their personal and financial information.
“During the pandemic, they looked at how people were engaging and ways to manipulate that,” said Angel Grant, who tracks QR code fraud as vice president of security at F5, a Seattle-based app security company. “Cybercriminals always look for disruption to cause disruption.”
One of the newest QR code scams has targeted drivers at pay-to-park kiosks in several large Texas cities.
The National Humanities Center (NHC) is pleased to announce a new initiative to bolster college-level curricula for developing responsible artificial intelligence (AI) technologies.
Supported by a gift from Google, the NHC will partner with faculty from fifteen colleges and universities to create and implement courses designed to help students comprehend the myriad ways AI technologies are integrated into modern life and to think through the ethical issues involved in developing and deploying them.
Marquette University has announced the establishment of the Center for Data, Ethics, and Society within the Klingler College of Arts and Sciences to address the ethical, social and political dimensions of an increasingly data-driven society.
Grounded in the university’s Catholic, Jesuit mission of social justice, the center will focus on confronting data ethics issues such as the controversies and consequences of our increasingly data-driven lives and spaces, unfair algorithmic biases, the unequal effects of computational decision-making, the privacy threats of ubiquitous surveillance systems, and the role for corporate social responsibility and accountability. The center will take up these concerns through an applied and interdisciplinary approach to research, pedagogy and community engagement.
“The Center for Data, Ethics, and Society is an exemplary expression of our Catholic, Jesuit values as data and the ethics around data management become increasingly crucial topics,” said Dr. Heidi Bostic, dean of the Klingler College of Arts and Sciences. “Addressing fundamental concerns of justice and fairness in our data-saturated world—which particularly affect underrepresented groups—requires a commitment to diversity and inclusion. The center will seek to create inclusive curricula, collaborate with a diverse group of faculty and community partners, and create a welcoming community that expands opportunities for students from all backgrounds.”
The Chinese government’s pervasive surveillance of Xinjiang’s Muslim Uyghur population makes any contact Uyghurs have with friends and relatives overseas a potential red flag for police scrutiny. When four family members died in succession in recent years, a heartbroken Aksu refrained from calling home to ask about the circumstances of their deaths to shield relatives from potentially dangerous state scrutiny for having overseas ties.
So when I talked to Aksu in November, I made sure to use Signal, an encrypted phone app, to protect our discussion about psychological trauma afflicting Uyghurs overseas.
The next day, I received an odd note from Otter.ai, the automated transcription app that I had used to record the interview. It read: “Hey Phelim, to help us improve your Otter’s experience, what was the purpose of this particular recording with titled ‘Mustafa Aksu’ created at ‘2021-11-08 11:02:41’?”
Biden named Alondra Nelson, the deputy director for science and society in the OSTP, as director of the office. The president also announced that Francis Collins, who retired in December from his role as director of the National Institutes of Health, would serve as his top science adviser and co-chair of the president’s Council of Advisors on Science and Technology.
Nelson and Collins will perform these roles until “permanent leadership is nominated and confirmed,” the White House said.
“These appointments will allow OSTP and the President’s Science and Technology agenda to move seamlessly forward under proven leadership,” the statement said.
Many universities are upgrading their networks to support research, teaching and learning, and other campus activities. In some cases, they are upgrading from 10Gbps connections to 100Gbps or multiple 100Gbps connections.
For researchers, the combination of faster networks and high-performance computing (HPC) systems accelerates their work as they seek new discoveries and scientific breakthroughs, from astronomers finding new insights about the universe to scientists improving crop production.
“In academia, research work requires a lot of horsepower to process and move data packets around the network,” says Will Townsend, senior analyst at Moor Insights & Strategy, a global technology analyst and advisory firm.
In January 2023, the US National Institutes of Health (NIH) will begin requiring most of the 300,000 researchers and 2,500 institutions it funds annually to include a data-management plan in their grant applications — and to eventually make their data publicly available.
Researchers who spoke to Nature largely applaud the open-science principles underlying the policy — and the global example it sets. But some have concerns about the logistical challenges that researchers and their institutions will face in complying with it. Namely, they worry that the policy might exacerbate existing inequities in the science-funding landscape and could be a burden for early-career scientists, who do the lion’s share of data collection and are already stretched thin.
The mandate, in part, aims to tackle the reproducibility crisis in scientific research. Last year, a US$2-million, eight-year attempt to replicate influential cancer studies found that fewer than half of the assessed experiments stood up to scrutiny. Efforts to tally the cost of irreproducible research in the United States have found that $10 billion to $50 billion is spent on studies that use deficient methods, a cost that is mostly fronted by public funding agencies.
Harvard Business Review, H. James Wilson and Paul R. Daugherty
In a previous HBR article (“Collaborative Intelligence: Humans and AI Are Joining Forces,” July–August 2018), we described how some leading organizations are defying the conventional expectation that technology will render people obsolete—they are instead using the power of human-machine collaboration to transform their businesses and improve their bottom lines. Now several companies are not merely out-innovating their competitors with this approach; they’re turning even more decisively toward human-centered AI technology and upending the very nature of innovation as it was practiced over the previous decade.
In the NFL’s case, for example, AI accelerated the image-recognition process, but the system would have failed without employees determining which data needed to be uploaded and then approved. And the NFL didn’t simply hand the job of making highlight reels over to AI; content creation experts performed that work, but they did it faster and more easily thanks to AI’s unique ability to quickly sort through massive volumes of information.
The facial recognition giant wants to work with private companies in retail and the gig economy. And with $50 million, they said they can get to 100 billion faces, enough to identify everyone on Earth.
I’m going to thread a bunch of slides from the pitch deck (but I won’t be sharing the full document for source-protection purposes). It’s from December, and Clearview’s chief confirms it was sent to a “small group” of potential investors.
The new leader of a little-known agency within the Commerce Department starts the job tasked with connecting every American to the internet, but also has ambitions to tackle Big Tech issues on the horizon.
Why it matters: Alan Davidson, the newly confirmed head of the National Telecommunications and Information Administration (NTIA), will manage tens of billions of dollars in federal spending on broadband — but he’s also talking about helping set administration policy around app stores and privacy.
The grant is an extension of AI Jumpstart, a $4 million program launched in April to implement AI infrastructure. It will provide businesses with an opportunity to work with faculty from Northeastern University — the lead school on the program — and Tufts and Boston universities as they put the resources gained from the launch of AI Jumpstart to use.
“We’re pleased to see the geographic impact that this program will have, impacting businesses from Boston to the Berkshires,” said Pat Larkin, director of the Innovation Institute at MassTech, a public economic development agency that fosters competition in the technology industry by creating partnerships with academia and businesses. “This program will show the impact AI can have on our ‘Made in Massachusetts’ companies, showing that AI isn’t only for large, global corporations, but for small businesses looking to innovate and grow.”
The two Valley companies to receive grants are Automatic Controversy Detection Inc. (AuCoDe), of Granby, and South Deerfield-based Worthington Assembly.
University of California, Berkeley; UC Berkeley School of Information
Online March 7-9. “We will stream content from the global WiDS events and provide live, virtual programming for the UC Berkeley campus throughout the week featuring distinguished scholars and practitioners from across the Bay Area and the world.” [registration required]
“In this competition, you’ll identify elements in student writing. More specifically, you will automatically segment texts and classify argumentative and rhetorical elements in essays written by 6th-12th grade students. You’ll have access to the largest dataset of student writing ever released in order to test your skills in natural language processing, a fast-growing area of data science.” Deadline for entries is March 8.
The eScience Institute’s Data Science for Social Good program is now accepting applications for student fellows and project leads for the 2021 summer session. Fellows will work with academic researchers, data scientists and public stakeholder groups on data-intensive research projects that will leverage data science approaches to address societal challenges in areas such as public policy, environmental impacts and more. Student applications due 2/15 – learn more and apply here. DSSG is also soliciting project proposals from academic researchers, public agencies, nonprofit entities and industry who are looking for an opportunity to work closely with data science professionals and students on focused, collaborative projects to make better use of their data. Proposal submissions are due 2/22.
Gets everyone thinking more clearly, and therefore communicating more clearly, about their work: Have you ever had the feeling you know a thing inside out, but then as soon as it comes time to talk about that thing in a meeting, the words come out all wrong? I have. In those moments it can be hard to tell whether the gap in clarity is in the idea itself or the communication of it. But the truth is, in the work context, it doesn’t matter. When you codify ideas in a piece of writing, you also codify your thinking.