For most freight companies, the answer to the troublesome driver shortage lies in Gen Z – younger individuals who have not yet sat behind the wheel of a cab. While regulations have traditionally set minimum age limits for commercial vehicle operation, Congress is currently reviewing a bill that would ease these rules. The DRIVE-Safe Act, a bipartisan bill, would allow individuals under the age of 21 to drive across state lines through a two-step apprenticeship program, greatly increasing the pool of potential driving candidates when it is needed most.
Yet this change doesn’t come without concerns, chief among them experience and safety. Understandably, a first-time truck driver may not have the skill to maintain proper safety procedures the way a 10-year veteran would. That puts a significant onus on fleets to support these drivers through technology, implementing systems such as artificial intelligence (AI) and predictive analytics that offer guidance in real time and help new drivers develop their skills.
All the evidence suggests it’s tough to get hold of talented data scientists – and that’s even true at NASA, says David Meza, acting branch chief of people analytics and senior data scientist at the US space agency.
So what does a skills gap look like at NASA? Meza says his team is still taking a “deep dive” into the organisation’s data science talent demands – but clear patterns are emerging, particularly in terms of identifying capability that already exists within the organisation.
“One of the biggest challenges has been to identify where our data science skills are within NASA. It’s not a terminology or an occupation that’s been labelled data science within the government. It’s still something that’s in development to have a work role or an occupation of ‘data science’,” he says.
… In our view, the current model, in which the digital traces of our lives are monopolized by corporations, threatens the ability of society to produce the rigorous, independent research needed to tackle pressing issues. It also restricts what information can be accessed and the questions that can be asked. This limits progress in understanding complex phenomena, from how vaccine coverage alters behaviour to how algorithms influence the spread of misinformation.
Instead, we call for the creation, management and curation of behavioural data in public data trusts.
Access denied
This political economy of data puts social scientists in a difficult position. Access comes with conditions: companies have an active interest in the questions researchers ask (or don’t) as well as the data they can access and how it is analysed. And it is rarely possible for scientists to determine what information was not included when the gatekeepers do grant access, or how the data were generated in the first place.
Duke University, Kenan Institute for Ethics, Erin Miller
from
As illuminated by the pandemic, surgeons often must make decisions quickly, and those decisions can be affected by ignorance, emotion, and bias. If a donor is killed in an auto accident, for example, the surgeon might need to decide who should receive the transplanted organ without much time to review all cases thoroughly. At times like these, a clinician’s moral judgment might not represent the values of the hospital or the public.
[Walter] Sinnott-Armstrong and colleagues believe that computer technology in the form of Moral Artificial Intelligence (AI) might lead to fairer decision-making and reduce bias and its effects on racial and economic injustice in decisions about scarce medical resources. They recently received funding from the University for a collaboratory to expand their work in Moral AI with colleagues at Duke and Duke Kunshan by researching moral judgments in these sorts of medical situations. The collaboratory aims to develop Moral AI that could be used to help doctors and hospital personnel make better judgments about who receives scarce resources.
The 35 Innovators Under 35 is our yearly opportunity to take a look at not just where technology is now, but where it’s going and who’s taking it there. More than 500 people are nominated every year, and from this group the editors pick the most promising 100 to move on to the semifinalist round. Their work is then evaluated by our panel of judges who have expertise in such areas as artificial intelligence, biotechnology, software, energy, and materials. With the insight gained from these rankings, the editors pick the final list of 35.
This week, Nature is publishing a special collection of articles with the objective of bridging the different research disciplines and different perspectives on doing science that underpin computational social science. This will be discussed at an upcoming panel at the joint NetSci-Sunbelt conference “Networks 2021” on July 2, 2021. Information about registering is provided below.
“Finally in the coherence / we weep”: the words are in serif font and the letters slightly effaced. The w in “weep,” in particular, is missing flecks of ink, and I know that Kameelah Janan Rasheed must have considered its acutely threadbare shape when she placed this word on the bottom right corner of an unnumbered page about three-quarters of the way through her 2019 book No New Theories. Throughout the book, and across her art practice, Rasheed attends to text that is tentatively legible and partially withheld.
Rigorously interdisciplinary and a self-named “learner,” Rasheed often makes work in the form of immersive installations stuffed with scraps of text that she finds or writes. Taking over surfaces like the facade of the Brooklyn Museum, a massive digital billboard in Times Square, or the walls and crevices of a gallery in Berlin, the artist makes architectural elements from poetic language, using her fine-tuned eye for typeface, color contrasts, and scale to bring viewers into a physically active reading experience.
At more than 70 percent of colleges, placement tests determine whether students need to take remedial courses. If those tests are inaccurate, students may find themselves incorrectly placed on a remedial track and enrolled in noncredit classes that delay them from earning their degrees and increase the cost of their education.
A working paper, one in a series released by the National Bureau of Economic Research in June, suggests that placement tests could be replaced by an algorithm that uses a more wide-ranging set of measures to predict whether a student would succeed in credit-bearing college courses.
The authors developed an algorithm and tested it in an experiment that included 12,544 first-year students across seven different community colleges in the State University of New York system, observing a subsample of students for two years.
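The paper’s actual algorithm is not reproduced here; as a rough illustration of the general idea – combining several measures into a single predicted probability of success, then placing students based on a threshold – here is a minimal logistic-scoring sketch. The features, weights, and cutoff are all hypothetical, not taken from the NBER working paper:

```python
import math

def predict_success(hs_gpa, placement_score, credits_attempted):
    """Toy logistic model: combine several measures into a predicted
    probability of passing a credit-bearing course.
    Weights are illustrative only, not estimated from real data."""
    z = -4.0 + 1.2 * hs_gpa + 0.02 * placement_score + 0.05 * credits_attempted
    return 1 / (1 + math.exp(-z))

# Hypothetical placement rule: assign the credit-bearing course
# when the predicted probability of success clears a 0.5 cutoff.
p = predict_success(hs_gpa=3.1, placement_score=65, credits_attempted=12)
print(round(p, 2), "college-level" if p >= 0.5 else "remedial")
```

The point of such a model is that a borderline test score alone no longer decides placement; a strong high-school GPA can offset it, which is how a multi-measure algorithm can reduce the misplacements that a single test produces.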
“With WatSPEED, organizations can partner with the University of Waterloo to create custom professional development content to equip their workforce to stay competitive and continuously evolve to keep pace with technological, societal, economic and environmental disruption,” said Sanjeev Gill, associate vice-president, innovation and executive director, WatSPEED at the University of Waterloo.
“WatSPEED is a unique approach to providing relevant education that will prepare professionals for a complex future. It will build on Waterloo’s foundation of academic excellence and strong ties to industry to help create a future-proof workforce.”
The Texas A&M Engineering Experiment Station (TEES) recently received a proposed five-year contract worth up to $24 million from the Army Research Laboratory (ARL) to conduct basic research toward establishing a collaborative distributed proving ground that will support autonomous vehicle research across various environments and domains at the Bush Combat Development Complex (BCDC) on The Texas A&M University System RELLIS Campus.
The research will focus on developing virtual proving grounds that enable researchers to develop, test and demonstrate artificial intelligence and machine learning algorithms for autonomous vehicles. Visual, thermal, LIDAR and RADAR datasets will also be collected, annotated and curated in relevant and diverse real and virtual environments, and used to evaluate artificial intelligence, machine learning and autonomy algorithms in both real and synthetic settings.
Looming large over the entire issue is an as-yet unfulfilled promise made by investor-billionaire Charlie Munger in 2016 to build new undergraduate dormitories for the school, which his grandson attended. He pledged $200 million to do so with the caveat that he be allowed to personally draft the designs. His blueprints currently feature suites of eight single bedrooms surrounding large common spaces. Most of the bedrooms, he has said, would be fitted with artificial windows modeled after portholes on Disney cruise ships. Customized lighting would mimic daylight.
Munger, 97, is a longtime business partner of Warren Buffett and has in recent years developed a sideline of sorts financing and constructing unconventional but highly efficient buildings at major universities. In 2013, he put $110 million toward a student housing project at the University of Michigan whose bedrooms also lacked windows.
A new pilot program at the University of Toronto will embed ethics modules into existing undergraduate computer science courses in a bid to ensure future technologies are designed and deployed in ways that consider their broader societal impact.
From learning about the complex trade-off between data privacy and public benefit to making design decisions that impact marginalized communities, the pilot program – led by the department of computer science, in the Faculty of Arts & Science, and the Schwartz Reisman Institute for Technology and Society (SRI) – will teach computer science students skills to identify potential ethical risks in the technologies they are learning to build.
Peter K. Enns, professor, Jeb E. Brooks School of Public Policy and Department of Government in the College of Arts and Sciences, and executive director of the Roper Center for Public Opinion Research, has been named Robert S. Harrison Director of the Cornell Center for Social Sciences (CCSS), which accelerates, enhances, and amplifies social science research at Cornell. The announcement was made by Emmanuel Giannelis, vice president for research and innovation.
UC Berkeley’s new vice chancellor for equity and inclusion wants the top public university to be “a place where people feel that their humanity is affirmed and that they develop as transformative practitioners — a place that they’re proud to be a part of.”
Dania Matos (she/her/ella), who is currently the University of California, Merced’s first associate chancellor and chief diversity officer, will take on the top administrative role on Aug. 16, Chancellor Carol Christ announced today (Friday, July 2).
Online August 8-12. “The theme for JSM 2021 is Statistics, Data, and the Stories They Tell. The website has lots of great resources, including an online program with a search engine. The search engine can be accessed without registering, so you can find the presentations most relevant to your work and make the case for financial support to attend.” [registration required]
National Academies of Sciences, Engineering, and Medicine
from
Online and Washington, DC October 14-15. “The Board of Science Education of the National Academies of Sciences, Engineering, and Medicine is hosting a public summit October 14-15, 2021 to take stock of the implementation of state science standards and determine the next steps to consider for continuing or reinvigorating implementation efforts.” [registration required]
Online September 9, starting at 9:30 a.m. Eastern. “With our speakers – philosophers, psychologists, neuroscientists and AI researchers – we will try to map different meanings of innateness in different fields, discuss whether those meanings are compatible and what the implications are; we will consider how this concept can be used and what the limits of its usefulness are.” Deadline to apply is August 10.
SPONSORED CONTENT
The eScience Institute’s Data Science for Social Good program is now accepting applications for student fellows and project leads for the 2021 summer session. Fellows will work with academic researchers, data scientists and public stakeholder groups on data-intensive research projects that will leverage data science approaches to address societal challenges in areas such as public policy, environmental impacts and more. Student applications due 2/15 – learn more and apply here. DSSG is also soliciting project proposals from academic researchers, public agencies, nonprofit entities and industry who are looking for an opportunity to work closely with data science professionals and students on focused, collaborative projects to make better use of their data. Proposal submissions are due 2/22.
PyUnity is a Python implementation of the Unity Engine. It is just a fun project, and many features have been stripped out to make it as easy as possible to create a scene and run it.
Becoming Human: Artificial Intelligence Magazine, James Montantes
from
If you are working on an AI project, it’s time to take advantage of NVIDIA GPU accelerated libraries if you aren’t doing so already. It wasn’t until the late 2000s that AI projects became broadly viable, when neural networks trained on GPUs drastically sped up the process. Since then, NVIDIA has produced some of the best GPUs for deep learning, making GPU accelerated libraries a popular choice for AI projects.
If you are wondering how you can take advantage of NVIDIA GPU accelerated libraries for your AI projects, this guide will help answer questions and get you started on the right path.
The United States Patent and Trademark Office’s (USPTO) Office of the Chief Economist released the Artificial Intelligence Patent Dataset (AIPD)—identifying which of the 13.2 million United States patents and pre-grant publications include artificial intelligence (AI)—to help enable researchers, policymakers, and the public explore the impacts of AI on invention.