“Imagine you go to the supermarket, and it’s just like, price tags are digital. And they depend on who you are,” says Catalin Voss, a Stanford University PhD candidate in artificial intelligence who, as I wrote in 2014, co-founded a company in his teens to measure emotions through facial tracking. “That’s, that’s not the world that we want to live in.”
Setting ethical boundaries in such a brave new world promises to be complicated.
Rana el Kaliouby, co-founder and CEO of Affectiva, a leading emotion recognition company, cites the hypothetical example of her teenage daughter using her iPhone. “Say her phone can now detect that, you know, she looks like she’s depressed,” she says. “Should her phone text mom and say: ‘Hey, Jana looks depressed, you should do something about this’? And should my daughter have a say?”
Students who don’t like Indiana University’s COVID-19 vaccine requirement can go elsewhere for their education.
That was the message delivered by the 7th Circuit Court of Appeals in a ruling issued Monday that will allow the public university’s requirement that all students and employees receive a COVID-19 vaccine before the start of the fall semester to stand.
In its decision denying an injunction request from a group of eight students who sought to block the mandate, alleging that it violates their constitutional rights, the court said that colleges and universities may decide what is necessary to keep students safe.
Should people who volunteer for genomic studies be told about unrelated disease mutations that turn up in their sequence data? The decadeslong debate about such “incidental findings,” which can include genes that boost risk for cancer or heart disease, flared up again last week after bioethicists at the National Institutes of Health (NIH) published a study showing many participants who at first refuse those findings can change their minds. Controversially, it went on to suggest all research participants should routinely be told about their genetic risks for conditions that can be prevented or treated—a change from current practice.
The controversy pits researchers, many of them physicians, who see incidental findings as an opportunity to boost the health of the millions who have had their genomes analyzed, against others, mainly bioethicists, who stress the need to respect study participants’ hesitation about receiving information that might expose them to genetic discrimination or simply be unwelcome. Deepening the divide, the study showed Black participants were more likely to refuse incidental results. “That strengthens the argument for saying we’ve really got to get true consent, opt-in consent from everyone,” says Susan Wolf, a lawyer who teaches health law and bioethics at the University of Minnesota Law School.
The state department of education’s new strategic plan would require every high school to offer at least one computer science class by 2023, with scholarships and incentives for teachers to learn how to teach it.
In this month’s installment of the Innovation of the Month series, we highlight EquiTensors, a project that is reflecting on and raising awareness of the applications, opportunities and potential misuses of data science and AI applied to mobility and transportation, specifically as they relate to race, equity and diversity. MetroLab’s Josh Schacht spoke with the leader of the project, Bill Howe, University of Washington associate professor in the Information School, adjunct associate professor in Computer Science and Engineering, and associate director and senior data science fellow at the UW eScience Institute.
In July 2016, two cardiologists and a handful of computer scientists and developers rolled up in a minivan to Apple’s special projects office in Cupertino, Calif., with a big idea to show a company with grand designs on transforming health care.
The team from Johns Hopkins University had received a rare invitation from Apple to workshop their mock-ups for Corrie, an app to guide heart attack patients through the maze of recovery. For a week, Apple and the Hopkins team labored on the design, carefully talking through the minutiae of each feature.
Corrie is designed to make everything that’s hard about managing recovery after a heart attack far easier for patients — and, in turn, keep them out of the hospital. Once home, the app helps track their vital signs and activity data with the help of an Apple Watch and a Bluetooth blood pressure cuff. It sends reminders when a patient needs to take a pill or head in for a follow-up appointment, and also serves as a hub of critical health information, including guidance on diet and exercise, that’s often lost in the chaos of a hospital discharge.
In my latest project, I had the opportunity to work on electrical stimulation for a bidirectional BCI. In this article, I will define what we mean by bidirectional BCI, present several commercial applications for bidirectional BCIs, explain the concept of brain augmentation associated with BCIs, go through some disadvantages of electrical stimulation and share my opinion about future commercial applications using bidirectional BCIs.
Artists reaching number 1 is as much a function of the labels behind them as it is of their artistic ability. The narrative worth exploring is the artistic process and what that means in the modern age.
The culture of creativity is being defined by platforms and the audiences on them. Platforms are reaping the rewards of modern consumption and driving the growth of the global industry. Global audiences theoretically give any artist a shot at putting their music in front of millions. However, as the UK Government’s DCMS enquiry put it, the talent behind this growth is losing out. Funds for creators seem to be the strategy of the hour, examples being:
Facebook’s $1 billion creator fund
YouTube’s $100 million Shorts creator fund
U.S. college endowments posted their strongest performance since 1986 as stocks and alternative investments surged.
The median return before fees was 27% for the 12 months through June, according to data to be published Tuesday by Wilshire Trust Universe Comparison Service. The biggest funds, those with assets of at least $500 million, did even better, with a median gain of 34%.
The investment results, on top of an infusion of federal stimulus funds, are a welcome financial balm for schools that have grappled with revenue declines as the pandemic curtailed enrollment.
Officials in the Homeland Security Department’s Science and Technology Directorate are being intentional about how their hub supports the entire enterprise in pursuing artificial intelligence and machine learning capabilities.
In an 18-page AI/ML Strategic Plan, released by the research and development arm on Friday, they point to inherent risks associated with advancing those novel technological capabilities and discuss how S&T aims to move forward responsibly.
“We spent probably 10 months developing this document, and in the first two or three months, it was more of a brainstorming session to determine how we were going to move forward and how we were going to frame this out,” Acting Deputy Director of DHS Technology Centers Division John Merrill told Nextgov in an interview on Tuesday. “Over the course of those early months, we had numerous sessions and discussions—looking at actual AI/ML capabilities and use cases, then the operational components coming back and giving us the input in terms of their expectations or what they wanted to do.”
Facebook Inc. has disabled the personal accounts of a group of New York University researchers studying political ads on the social network, claiming they are scraping data in violation of the company’s terms of service.
The company also cut off the researchers’ access to Facebook’s APIs, technology that is used to share data from Facebook to other apps or services, and disabled other apps and Pages associated with the research project, according to Mike Clark, a director of product management on Facebook’s privacy team.
Scientists have turned to artificial intelligence (AI) models during the COVID-19 pandemic to predict the increase, decrease, and spread of infection. Typically, the models depend on fixed assumptions such as an externally given transmission rate for the virus and a specific pattern to human movements that supposes two people will meet with a given frequency. While this approach can shed light on the situation, it is missing a key component, says Lin William Cong, Graduate School of Management.
“We know that the spread of the disease and the movement of people are driven by economic incentives,” says Cong (pronounced Ts-óh-ng). “If confirmed cases rise, I will stay home because I worry about the risk of catching COVID. But if I lost my job for three months, I have to go out and make a living despite the pandemic. There are some very good AI predictive models that look at the dynamics of COVID and predict how employment will evolve, but they typically look at these issues separately. What if we allow economic incentives to be taken into consideration as part of the COVID model, too?”
To explore this, Cong recently joined with Beijing-based researchers Ke Tang, at Tsinghua University, and Jingyuan Wang and Bing Wang, both at Beihang University, to design a pandemic AI model that includes economic factors. Their model combines a Google community mobility measure that summarizes the movements of people; an employment model based on the assumption that if the employment rate is low, the incentive to work is high; and an epidemiology model on the spread of the disease.
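The general idea of coupling epidemic dynamics with economic incentives can be illustrated with a toy model. The sketch below is purely hypothetical and is not the researchers’ actual model: it runs a discrete-time SIR simulation in which the effective contact rate is scaled up by an incentive-to-work term and scaled down as case counts rise (the stay-home fear effect Cong describes). All parameter names and the coupling form are illustrative assumptions.

```python
def simulate(days=60, n=1_000_000, i0=100, beta=0.3, gamma=0.1, employment=0.9):
    """Toy SIR model with an incentive-driven mobility multiplier.

    - `employment` stands in for the economic incentive to go out and work.
    - Mobility falls as the infected share rises (people stay home when
      confirmed cases climb), echoing the behavior described in the article.
    Returns the daily count of infected individuals.
    """
    s, i, r = n - i0, i0, 0
    history = []
    for _ in range(days):
        # Mobility: baseline set by the incentive to work, damped by case load.
        mobility = employment / (1.0 + 5.0 * i / n)
        new_infections = beta * mobility * s * i / n
        new_recoveries = gamma * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        history.append(i)
    return history

curve = simulate()
```

With these illustrative parameters the effective reproduction factor starts above 1, so infections grow at first and then slow as rising case counts suppress mobility; lowering `employment` flattens the curve, which is the kind of feedback a joint economic-epidemiological model is meant to capture.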
The new school will include transitional academic divisions, university-wide cross-cutting themes organized into institutes and an accelerator focused on solutions. Stanford is now launching the search for a dean to lead the new school.
When Stanford’s new school focused on climate and sustainability begins operating in fall 2022, it will include a set of transitional academic divisions that will evolve into multiple departments as the school grows; cross-cutting themes organized within institutes to draw on the expertise of the entire university; and an accelerator to drive new technology and policy solutions.
Stanford President Marc Tessier-Lavigne selected this blueprint after he and Provost Persis Drell reviewed, and slightly modified, options provided in a report (SUNetID required) from the faculty Blueprint Advisory Committee (BAC) that has been meeting since last fall to develop options and recommendations for the structure of the new school.
In a landmark decision, an Australian court has ruled that artificial intelligence (AI) systems can be legally recognised as an inventor in patent applications.
“For this challenge, we are re-sharing our saliency model and the code used to generate a crop of an image given a predicted maximally salient point and asking participants to build their own assessment. Successful entries will consider both quantitative and qualitative methods in their approach. For more details on the challenge, including how to enter and the rubric we’ll use to score entries, visit the submission page on HackerOne. Aiding us in reviewing entries will be our esteemed panel of judges: Ariel Herbert-Voss, Matt Mitchell, Peiter “Mudge” Zatko, and Patrick Hall.” Deadline for submissions is August 6.
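The crop-generation step described above, producing a fixed-size crop of an image centered on a predicted maximally salient point, can be sketched as follows. This is a generic illustration, not Twitter’s released code; the function name and the clamping behavior are assumptions.

```python
def crop_around_point(width, height, sx, sy, crop_w, crop_h):
    """Return (left, top, right, bottom) of a crop_w x crop_h window
    centered as close as possible to the salient point (sx, sy),
    clamped so the window stays fully inside the image."""
    left = min(max(sx - crop_w // 2, 0), width - crop_w)
    top = min(max(sy - crop_h // 2, 0), height - crop_h)
    return left, top, left + crop_w, top + crop_h
```

For example, a salient point near an image edge simply slides the window inward rather than centering on the point, which is exactly the kind of behavior (where the crop lands relative to faces or skin tones) that challenge entrants were asked to assess.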
SPONSORED CONTENT
The eScience Institute’s Data Science for Social Good program is now accepting applications for student fellows and project leads for the 2021 summer session. Fellows will work with academic researchers, data scientists and public stakeholder groups on data-intensive research projects that will leverage data science approaches to address societal challenges in areas such as public policy, environmental impacts and more. Student applications due 2/15 – learn more and apply here. DSSG is also soliciting project proposals from academic researchers, public agencies, nonprofit entities and industry who are looking for an opportunity to work closely with data science professionals and students on focused, collaborative projects to make better use of their data. Proposal submissions are due 2/22.
from Stanford University, Stanford Institute for Human-Centered Artificial Intelligence
Launched two years ago, AIMI has already acquired annotated datasets for more than 1 million images, many of them from the Stanford University Medical Center. Researchers can download those datasets at no cost and use them to train AI models that recommend certain kinds of action.
Now, AIMI has teamed up with Microsoft’s AI for Health program to launch a new platform that will be more automated, accessible, and visible. It will be capable of hosting and organizing scores of additional images from institutions around the world.
Storm surges can be deadly coastal hazards, but the existing historical tide gauge data needed to better understand them, and perhaps predict their impacts, doesn’t extend far enough back in time.
That’s why University of Central Florida researchers are working to reconstruct the missing data and compile the information in a newly created online Database of Global Storm Surge Reconstructions, or GSSR. The work is detailed in a recent study in Nature Scientific Data.
The new information will improve the ability of scientists to perform storm surge flood risk assessments under present-day climate conditions.