Concentrations of the coronavirus in wastewater in parts of the Las Vegas Valley are approaching levels last seen during the winter peak of the disease in Nevada, even though key COVID-19 metrics like new cases and hospitalizations remain far lower, researchers say.
Two of the seven valley locations where wastewater samples are being regularly tested for SARS-CoV-2, the virus that causes COVID-19, are showing virus concentrations close to the peak of the winter surge, according to Daniel Gerrity, principal research microbiologist for the Southern Nevada Water Authority.
“Unless something changes, we will be at that level across all of Southern Nevada relatively soon,” he predicted.
Dialogue without data is a waste of time. That was the warning members of a new U.S. National Academies of Sciences, Engineering, and Medicine panel, which is examining the threat posed by other countries trying to steal federally funded research, delivered yesterday to a panel of U.S. government watchdogs.
Members of the National Science, Technology, and Security Roundtable—formed last year to promote discussions among federal officials, academic leaders, and national security experts—complained that presentations from a trio of major research agencies lacked the baseline data needed to determine the scope of the problem and what the research community can do to minimize risks.
“I hope you can sense our frustration,” Maria Zuber, a co-chair of the roundtable and vice president for research at the Massachusetts Institute of Technology, said at the end of a 2-hour online session. “It’s impossible for us to gain an understanding of the challenge we face with the information we are being given.”
Many of the problems to which ML can be applied are tasks whose conditions are obvious to humans. That’s because we’re trained to notice those problems through observation—which cat is more floofy or at what time of day traffic gets the most congested. Other ML-appropriate problems could be solved by humans as well given enough raw data—if humans had a perfect memory, perfect eyesight, and an innate grasp of statistical modeling, that is.
But machines can do these tasks much faster because they don’t have human limitations. And ML allows them to do these tasks without humans having to program out the specific math involved. Instead, an ML system can learn (or at least “learn”) from the data given to it, creating a problem-solving model itself.
This bootstrappy strength can also be a weakness, however. Understanding how the ML system arrived at its decision process is usually impossible once the ML algorithm is built (despite ongoing work to create explainable ML). And the quality of the results depends a great deal on the quality and the quantity of the data. ML can only answer questions that are discernible from the data itself. Bad data or insufficient data yields inaccurate models and bad machine learning.
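The point about data quality can be made concrete with a toy model. The sketch below (purely illustrative, not drawn from the article) fits a straight line by ordinary least squares, once on clean data and once on the same data with a single corrupted label; the corrupted fit drifts well away from the true relationship, just as bad data yields a bad learned model.

```python
def fit_line(xs, ys):
    """Ordinary least squares fit for y = a*x + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

xs = list(range(10))
good_ys = [2 * x + 1 for x in xs]             # clean data: true relationship y = 2x + 1
bad_ys = [2 * x + 1 + (50 if x == 9 else 0)   # same data with one corrupted label
          for x in xs]

a_good, b_good = fit_line(xs, good_ys)
a_bad, b_bad = fit_line(xs, bad_ys)

print(a_good, b_good)  # 2.0 1.0 -- recovers the true slope and intercept
print(a_bad, b_bad)    # slope pulled far from 2 by a single bad point
```

The same mechanism scales up: a model, however sophisticated, can only recover patterns that are actually present and correct in its training data.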
The university also will propose new initiatives to advance its offerings in computing and digital literacy, skills needed by students across all disciplines of study (such as precision agriculture) and central to supporting economic growth in Wyoming.
It will propose to launch a new School of Computing, as well as a campuswide Center for Entrepreneurship and Innovation (CEI) and the Wyoming Outdoor Recreation, Tourism and Hospitality (WORTH) Initiative. These three linked initiatives are aimed squarely at training students in areas important for advancing key markets for the future economy of Wyoming, while propelling the new Wyoming Innovation Partnership.
The World Health Organization (WHO) recently released a report presenting guidance around the ethical use of artificial intelligence (AI) in the health sector.
The lack of general consensus on the ethical use of AI has sparked debate in the industry, with some raising concerns about the technology's implications. This has led organizations to offer their own solutions, such as the National Institute of Standards and Technology's recent proposal to reduce bias in the use of AI.
The WHO’s report, titled Ethics and Governance of Artificial Intelligence for Health, seeks to address similar concerns — as well as potential benefits — of AI’s potential roles in the health sector.
For the third time since the pandemic began, the U.K. is betting against the rest of the world.
The first time brought catastrophic consequences. The U.K. flirted with the idea of herd immunity and, despite urging from the rest of the world, delayed the decision to go into a full lockdown, a delay that ended up costing thousands of lives.
The second time was less of a failure. The U.K. was the first western country to start a Covid-19 vaccination campaign. It then took another controversial step, becoming the first to extend the interval between the two vaccine doses. The decision was widely criticized but was later found to have at least some advantages.
Now, Britain has done it again. For the third time, it has become the so-called guinea pig that will either end up teaching the world what to do, or what to avoid.
You’ve heard the term “machine learning” as it becomes recognized as a valuable tool to help physicians diagnose and manage patients, along with other applications in medicine.
But do you understand what that buzzword really means?
Two experts recently explained the fundamentals of machine learning, what it means in the clinical setting and the possible risks of using the technology during an education session, “Machine Learning: An Introduction and Discussion of Medical Applications,” held during the June 2021 AMA Sections Meetings and hosted by the AMA Medical Student Section.
As COVID-19 gripped the nation last spring and graduates shuddered at the thought of a pandemic job market, three professors from Western’s Huxley College of the Environment came together to create a cure for some of those fears: the Data Science for Environmental Applications certificate program.
The work of three Environmental Science faculty members, Jenise Bauman, Rebecca Bunn and Andy Bunn, culminated in a nine-month certificate course intended to give participants tangible, valuable data skills; the program's first cohort finished this past school year.
Starting this fall, Kansas residents 60 and older can no longer audit Wichita State classes for free.
Like tuition-paying students, senior auditors will now be charged a fee for every class they take, ranging from roughly $8 a credit hour for liberal arts and science courses to around $68 a credit hour for business courses. Most WSU courses are three credit hours.
The new program, dubbed CSUCCESS (which stands for California State University Connectivity Contributing to Equity and Student Success), is designed to create more equitable conditions and opportunities for students at CSU.
The university points out that half of all CSU undergrads receive Pell Grants — grants that are awarded to students who display “exceptional need” — and nearly a third are the first in their family to pursue a bachelor’s degree.
“CSUCCESS will assure that students have immediate access to innovative, new mobile tools they need to support their learning, particularly when faced with the lingering effects of the pandemic,” said CSU Chancellor Joseph I. Castro. “The new initiative will establish a foundation for their achievement and has the potential to play a key role in eliminating stubborn equity gaps among our talented and diverse students. In addition to truly addressing equity and access, we see iPad Air as a powerful tool to prepare our students for their future careers.”
The brutal heat wave that killed hundreds of people last week in the Pacific Northwest and Canada would have been “virtually impossible” without climate change, a study released on Wednesday said, offering the latest evidence that global warming is making extreme heat more common and more dangerous.
In a preliminary analysis, called a rapid attribution study, an international team of 27 scientists examined the links between human-induced climate change and the heat wave, finding that the blistering temperatures were made at least 150 times more likely to occur because of climate change.
The last two decades have seen an uptick in people choosing to monitor their health using wearable technologies such as Fitbits and Apple Watches. The wearable technology market is valued at $116.2 billion, and is projected to reach $265.4 billion by 2026. Some wearable devices gather not only information like calories burned and steps taken, but also heart rate, blood pressure and sleeping patterns. These data points are continuously collected from users, but because they haven’t been validated at a clinical level, the data aren’t necessarily usable by medical professionals.
A multi-institution team led by University of Arizona electrical and computer engineering professor Janet Roveda is building a future in which wearable devices will allow clinicians to gather patient data remotely and provide “care in place” so patients don’t need to leave their homes. The team founded the Center to Stream Healthcare in Place, or C2SHIP, which was first selected as a National Science Foundation Industry-University Cooperative Research Center, or IUCRC, in 2018 and received $15,000 in startup funding.
C2SHIP recently received a continuing NSF grant of $3 million, with $1.125 million earmarked for UArizona.
from The Kansas City Star; Jonathan Shorman, Jeanne Kuang, Jake Kincaid, and Derek Kravitz
A joint investigation by The Kansas City Star and Columbia University’s Brown Institute for Media Innovation reveals how June became a lost month in the fight to slow the spread of delta across Missouri. Thousands of pages of internal emails and other documents from 19 local health departments trace the growing alarm and a sense of near-resignation among officials about their chances of halting the advance of the variant.
The consequences of the squandered month will last well into summer. CoxHealth, a major Springfield hospital, told The Star it’s bracing for hospitalizations to rise for weeks to come. Delta is still spreading and has now been found in the Kansas City and St. Louis areas, though state officials hope higher vaccination rates in those places will limit increases in cases. Schools will also begin next month with some parents in open rebellion against imposing mask requirements, even with delta all but certain to continue circulating.
The W3C’s members do it all by consensus in public GitHub forums and open Zoom meetings with meticulously documented meeting minutes, creating a rare archive on the internet of conversations between some of the world’s most secretive companies as they collaborate on new rules for the web in plain sight.
But lately, that spirit of collaboration has been under intense strain as the W3C has become a key battleground in the war over web privacy. Over the last year, far from the notice of the average consumer or lawmaker, the people who actually make the web run have converged on this niche community of engineers to wrangle over what privacy really means, how the web can be more private in practice and how much power tech giants should have to unilaterally enact this change.
Online October 21-22. “The Fourth Annual Data Science Connect Conference will combine our annual individual conferences across the south to make up one large southern conference that will showcase the wide variety of advanced data science applications across industry verticals and technical disciplines.” [$$$]
“In the UIST Student Innovation Contest (aka the “SIC”), we explore how novel input, interaction, actuation, and output technologies can augment interactive experiences! This year, in partnership with Sony Interactive Entertainment, we are seeking students who will push the boundaries of input and output techniques with the TOIO micro robot platform.” Deadline to apply is August 2.
“XPRIZE Carbon Removal is aimed at tackling the biggest threat facing humanity – fighting climate change and rebalancing Earth’s carbon cycle. Funded by Elon Musk and the Musk Foundation, this $100M competition is the largest incentive prize in history, an extraordinary milestone.” Deadline for submissions is October 1.
SPONSORED CONTENT
The eScience Institute’s Data Science for Social Good program is now accepting applications for student fellows and project leads for the 2021 summer session. Fellows will work with academic researchers, data scientists and public stakeholder groups on data-intensive research projects that will leverage data science approaches to address societal challenges in areas such as public policy, environmental impacts and more. Student applications due 2/15 – learn more and apply here. DSSG is also soliciting project proposals from academic researchers, public agencies, nonprofit entities and industry who are looking for an opportunity to work closely with data science professionals and students on focused, collaborative projects to make better use of their data. Proposal submissions are due 2/22.
You remember that time you passive-aggressively completed a story in the most useless way possible to check that checkbox? That’s most monitoring systems.
Take a look at your project’s compilation warnings. If you’re using NPM, you’ll see impossible-to-resolve deprecation warnings a mile long and quickly realize how much people ignore issues. Still, something has everyone convinced that people actually want to fix things. What leads to this massive disconnect? Bad monitoring. Let’s go over the traits of good and bad systems.
We’ve built Android ML Platform, an updateable, fully integrated ML inference stack. With Android ML Platform, developers get:
Built-in on-device inference essentials – we will provide on-device inference binaries with Android and keep them up to date; this reduces APK size
Optimal performance on all devices – we will optimize the integration with Android to automatically make performance decisions based on the device, including enabling hardware acceleration when available
A consistent API that spans Android versions – regular updates are delivered via Google Play Services and are made available outside of the Android OS release cycle
from MIT Sloan Management Review; Roger Hoerl, Diego Kuonen, and Thomas C. Redman
In an earlier article, we pointed out the major structural flaw hindering many data science programs — the inherent conflict between data science groups (which we termed the lab) and business operations (termed the factory). To resolve that conflict, we proposed a data science bridge: an intermediary group headed by a person with the title innovation marshal tasked with ensuring better communication and integration between the two groups and surfacing the best ways to make inventions by the lab fit into the needs of the factory.
This article builds on that structural solution by addressing the issues associated with managing the process at an enterprise level. Proper management includes driving collaboration, developing human capital, ensuring data quality, managing the project portfolio, and ensuring the business impact of all data science efforts. We propose that this overall data science management process be owned by the person leading the data science bridge.