“This major award from the National Science Foundation will allow Auburn and collaborating institutions to foster a more diverse workforce while improving educational opportunities for disabled students,” said James Weyhenmeyer, Auburn’s vice president for research and economic development.
What happened when an economics researcher, a bioengineer and a tech entrepreneur joined forces in a pandemic? They envisaged a research-grant system with an application form that can be completed in less than 30 minutes, a decision-making process that takes just 48 hours, and funding that arrives within a week — and then set up the system in 10 days, using donations from philanthropists.
The scheme, known as Fast Grants, launched in April 2020 and received 4,000 applications in its first week. It was created by Tyler Cowen, an economics researcher at George Mason University in Fairfax, Virginia; Patrick Collison, co-founder of online payment processing platform Stripe; and Patrick Hsu, a bioengineer at the University of California, Berkeley.
Now the trio have released the results of a survey about how the speedy funding benefited scientists’ work in the early stages of the COVID-19 pandemic and how conventional routes of funding might be overly bureaucratic.
from University of Tennessee, Department of Physics & Astronomy
The University of Tennessee, Knoxville, is partnering with institutions across the country in a multimillion-dollar project to study the merger of neutron stars—objects 1.5 to 2.5 times the mass of our sun that serve as a model for giant atomic nuclei and generate elements heavier than iron.
The National Science Foundation is investing $3.25 million to create the Nuclear Physics of Multi-Messenger Mergers research hub, which capitalizes on Nobel Prize-winning science and advanced computational tools to learn about the nature of matter in ways not possible in a standard laboratory. The hub will emphasize training a diverse cadre of new doctoral graduates to ensure a next generation of scientists will broaden the field and carry the research forward.
https://defi-learning.github.io, open to everyone! Bringing together finance and computer science. Starting Aug 26! Sign up, spread the word, let us know your suggestions!
Facebook has blocked a team of New York University researchers studying political ads and COVID-19 misinformation from accessing its site, a move that critics say is meant to silence research that makes the company look bad.
The researchers at the NYU Ad Observatory launched a tool last year to collect data about the political ads people see on Facebook. Around 16,000 people have installed the browser extension. It enables them to share data with the researchers about which ads the users are shown and why those ads were targeted at them.
Facebook said on Tuesday that it had disabled the researchers’ personal accounts, pages, apps and access to its platform.
“NYU’s Ad Observatory project studied political ads using unauthorized means to access and collect data from Facebook, in violation of our terms of service,” Mike Clark, Facebook’s product management director, wrote in a blog post.
The regulatory framework governing these tools is complex. FDA regulates some—but not all—AI-enabled products used in health care, and the agency plays an important role in ensuring the safety and effectiveness of those products under its jurisdiction. The agency is currently considering how to adapt its review process for AI-enabled medical devices that have the ability to evolve rapidly in response to new data, sometimes in ways that are difficult to foresee.
This brief describes current and potential uses of AI in health care settings and the challenges these technologies pose, outlines how and under what circumstances they are regulated by FDA, and highlights key questions that will need to be addressed to ensure that the benefits of these devices outweigh their risks. It will take a collective effort by FDA, Congress, technology developers, and the health care industry to ensure the safety and effectiveness of AI-enabled technology.
The first thing you have to understand about your inbox is that the things you do with emails have a direct impact on whether you’ll even see the next one. The three major companies behind the email platforms used by most Americans—Google (Gmail), Microsoft (Outlook and Hotmail), and Verizon (AOL and Yahoo Mail)—all have designed their products to protect your inbox with software that suppresses messages you don’t want. Opening an email and clicking on a link inside it might tell the software’s algorithms that you want more like it. So might scrolling down the body of an email, or spending a certain amount of time reading it, or starring it, or filing it into a folder. Ignoring other messages, meanwhile, can lead the mailbox software to start junking them, or even blocking the senders.
Email is one of the few ways companies can reach their customers directly. In fact, people overwhelmingly say that the way they want to hear from brands is by email, Chad S. White, the head of research for Oracle Marketing Consulting, told me. That’s why the mailbox software started suppressing messages—to protect people from companies’ temptation to send too many emails. In response, email marketers obsess over “deliverability,” or how the content and frequency of their emails might help those messages actually hit your inbox in the first place. But that process has created new and weird feedback loops, in which some companies and certain messages might be able to reach your inbox more readily than before, while others get junked—condemned to spam, deleted, or the like—before you see them.
As a result, your personal inbox gradually has become less like a mailbox and more like a wormhole into every business relationship you maintain: your bank; your utility provider; your supermarket; your favorite boutiques, restaurants, housewares providers, and all the rest. It’s your own digital commercial district.
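The engagement feedback loop described above can be sketched as a toy model. Everything here is illustrative: real mailbox providers weigh far more signals than this, and the scores, thresholds, and function names below are invented.

```python
# Toy model of engagement-based inbox filtering. The signals, scores,
# and threshold are invented for illustration; real mailbox providers
# use many more signals and proprietary models.

def update_reputation(reputation, engaged, lift=2, decay=1):
    """Opening or clicking lifts a sender's score; ignoring mail erodes it."""
    return min(10, reputation + lift) if engaged else max(0, reputation - decay)

def delivered_to_inbox(reputation, threshold=5):
    """Messages from low-scoring senders get junked before you see them."""
    return reputation >= threshold

# A sender whose messages are mostly ignored slides toward the spam folder.
reputation, history = 6, []
for engaged in [False, False, False, True, False, False]:
    reputation = update_reputation(reputation, engaged)
    history.append(delivered_to_inbox(reputation))
print(history)
```

In this toy run the occasional open buys only a brief return to the inbox; sustained neglect keeps the sender below the delivery threshold, which is the feedback loop marketers are trying to game.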
from University of California-Santa Barbara, The UCSB Current
More than 700 imaging satellites orbit the Earth, and every day they beam vast amounts of information to databases on the ground. There’s just one problem: While the geospatial data could help researchers and policymakers address critical challenges, only those with considerable wealth and expertise can access it.
Now, a team of scientists, including UC Santa Barbara’s Tamma Carleton, has devised a machine learning system to tap the problem-solving potential of satellite imaging. The tool employs low-cost, user-friendly technology that could bring access and analytical power to researchers and governments worldwide. The study appears in the journal Nature Communications.
Dr. Nicki Washington and Dr. Shaundra Daily of Duke University have been awarded $10 million from the National Science Foundation to create The Alliance for Identity-Inclusive Computing Education (AIICE).
AIICE focuses on access and retention for historically underrepresented groups in computing education by implementing systemic changes; combining social science with computer science to boost identity awareness; creating an all-inclusive environment; and working to implement identity-inclusive, policy-driven changes at the K-12 level in computer science education.
The Office of Institutional Technology at Duke will partner with universities around the U.S. to implement these changes. The Duke team said that this work could impact 525,000 high school students and 35,000 undergraduate computer science students across the nation.
Like many people, Yota Batsaki spent the last year learning about the natural world around her. Unlike the majority of them, the comparative literature scholar’s interest was mostly nurtured online rather than during long walks to escape lockdown cabin fever.
In March, Batsaki, executive director of Harvard’s Dumbarton Oaks research institute, library, museum, and garden in Washington, D.C., and a group of colleagues launched the Plant Humanities Lab — a digital repository of information and narrative storytelling on the historical and scientific lives of plants like the peony, turmeric root, and the banana.
The project is part of a broader movement in humanities research that engages with critical questions of climate change and knowledge production. Researchers come from the social sciences, biology, botany, and other disciplines that rarely converge in academia. Through collaborative storytelling and information-gathering, they hope to shed light on the historical relationships between humans and their environments — and improve our current and future relationships with nature.
A team of researchers from Oxford University, UK, put forward a series of ideas and suggestions for addressing the final 25% of greenhouse-gas emissions and achieving Net-Zero. These included, for example, stimulating research into sustainable plastics and encouraging the use of alternative food proteins, such as plants, insects and algae.
The use of fossil fuels in transportation and industry accounts for a large amount of greenhouse gas emissions and is the primary target in our battle to achieve Net-Zero. With Net-Zero, the goal is to find a balance between the amount of greenhouse gas released and the amount that we can eliminate from the atmosphere.
However, researchers know that Net-Zero cannot be achieved simply by getting an electric car. There are still hard-to-reach emissions coming from agriculture, plastics and waste, as well as a small percentage of emissions already in the air that we cannot remove. This is known as the “Final 25%”, and if we want to reach Net-Zero, we have to take these emissions into account.
How much of the open space, sidewalks and other land in your neighborhood is shaded by trees? It depends on where you live — and, environmental advocates say, on decades of inequities that break down along class and racial lines.
Only about 3% of Sun Valley, a west Denver neighborhood where 94% of residents live in poverty, is covered by tree canopy. Just a few miles away, West Highland has a tree canopy cover of 18%, with only 12% of residents living in poverty.
Why do these disparities occur? American Forests, a conservation nonprofit, recently launched a measurement called tree equity scores for thousands of neighborhoods in the United States. A tree equity score is based on the employment rate, age, health, income, race, population density, surface temperature, and existing tree canopy of a neighborhood. If a neighborhood has a score of 100, it has achieved tree equity.
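As a purely hypothetical sketch of how a composite score of this kind might work (the weights, canopy goal, and formula below are invented for illustration; American Forests’ published methodology differs in its details):

```python
# Hypothetical sketch of a composite "tree equity"-style score.
# The priority index, canopy goal, and gap formula are invented for
# illustration; American Forests' actual methodology differs.

def tree_equity_score(canopy_pct, canopy_goal_pct, priority):
    """Score 0-100: full marks when existing canopy meets the goal,
    with the shortfall penalized more in high-priority neighborhoods.

    priority: 0-1 index combining income, employment, health, race,
    age, population density, and surface temperature.
    """
    gap = max(0.0, 1.0 - canopy_pct / canopy_goal_pct)  # canopy shortfall
    return round(100 * (1 - gap * priority), 1)

# Sun Valley-like neighborhood: 3% canopy vs. a 30% goal, high priority.
low = tree_equity_score(canopy_pct=3, canopy_goal_pct=30, priority=0.9)
# West Highland-like neighborhood: 18% canopy vs. a 30% goal, lower priority.
high = tree_equity_score(canopy_pct=18, canopy_goal_pct=30, priority=0.4)
print(low, high)
```

The design point such a score captures: a canopy shortfall counts for more in neighborhoods where poverty, heat, and health burdens are higher, so the low-canopy, high-poverty neighborhood scores far below the greener, wealthier one.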
A virtual research centre, called 6G Futures, has been launched by the University of Bristol and King’s College London.
The centre draws on more than 400 world-renowned experts in telecommunications networks, cybersecurity, artificial intelligence, digital humanities, social sciences and the arts to help shape the future of mobile technology for individuals and society.
Over the last several years, we’ve used this access to uncover systemic flaws in the Facebook Ad Library, to identify misinformation in political ads, including many sowing distrust in our election system, and to study Facebook’s apparent amplification of partisan misinformation. 2/4
from Freedom to Tinker blog; Orestis Papakyriakopoulos, Ashley Gorham, Eli Lucherini, Mihir Kshirsagar, and Arvind Narayanan
Facebook’s latest move to obstruct academic research about its platform by disabling NYU’s Ad Observatory is deeply troubling. While Facebook claims to offer researchers access to its FORT Researcher Platform as an alternative, that is an illusory offer, as we have recently learned firsthand in connection with our ongoing research project studying how social media platforms amplified or moderated the distribution of political ads in the 2020 U.S. elections.
New award-winning research from the Cornell Ann S. Bowers College of Computing and Information Science explores how to help nonexperts effectively, efficiently and ethically use machine-learning algorithms to better enable industries beyond the computing field to harness the power of AI.
“We don’t know much about how nonexperts in machine learning come to learn algorithmic tools,” said Swati Mishra, a Ph.D. student in the field of information science. “The reason is that there’s a hype that’s developed that suggests machine learning is for the ordained.”
Mishra is lead author of “Designing Interactive Transfer Learning Tools for ML Non-Experts,” which received a Best Paper Award at the annual ACM CHI Virtual Conference on Human Factors in Computing Systems, held in May.
Women are more likely to start a research career now than they were 20 years ago, reveals a longitudinal study of the publishing records of millions of researchers around the world. But they are less likely to continue their academic careers than are their male contemporaries, and in general publish fewer papers.
Ludo Waltman, a quantitative scientist at Leiden University in the Netherlands, and his colleagues took a deep dive into the huge Scopus citation and abstract database, hosted by Elsevier. They looked at the publication careers of some six million researchers globally who had authored at least three papers between 1996 and 2018. The team posted its findings on the preprint server arXiv.org.
The authors found that the proportion of women starting a career in science rose over time. In 2000, 33% of researchers starting their publishing career were women; that grew to 40% in recent years (see ‘Gender gap’). Waltman says that although the results are not surprising, it’s important that we now have concrete statistics confirming the trend for many countries and scientific disciplines.
Loyola University Maryland will permanently offer its Master of Science in Data Science program in a fully online format starting this fall.
As climate change intensifies the devastation from storms, wildfires and droughts, artificial intelligence (AI) and digital tools are increasingly being seen as a way to predict and limit its impacts.
Governments, tech firms and investors are showing growing interest in machine-based learning systems that use algorithms to identify patterns in data sets and make predictions, recommendations or decisions in real or virtual settings.
In June, the Rise Fund, an impact investing arm of private equity firm TPG, invested $100 million in a data and AI-driven “nowcasting” system devised by Kentucky-based startup Climavision to predict weather patterns with granular accuracy.
And an intergovernmental roadmap on AI’s role in fighting global warming is due to launch at November’s COP26 climate summit in Scotland.
But AI can also be highly energy-intensive and environmentally damaging, say critics who warn that the tech could be a costly distraction from more effective ways of tackling climate change.
Howard University has selected a team of private real estate development firms to construct a 260,000-square-foot laboratory and office building located at what will become the northwest corner of Georgia Avenue and Bryant Street NW, adjacent to Howard’s main campus in Washington, D.C. The National Research Center for Health Disparities will be a privately developed and funded project designed to attract pharmaceutical companies and biomedical research organizations that are focused on finding solutions to chronic illnesses, particularly those affecting communities of color.
“Howard has a long history of training problem solvers who are prepared to meet and combat the world’s most significant obstacles,” said Howard University President Wayne A. I. Frederick. “This national research center will support the University’s Howard Forward strategic plan by expanding our reach and creating a community where the world’s best minds, thought leaders and scientists can collaborate in one place to solve historic and contemporary health challenges and make our world a healthier and safer place.”
from Newswise, U.S. Department of Energy Office of Science
A collaborative team of scientists from The University of Texas at Austin, the University of Notre Dame, Louisiana State University, and Lawrence Berkeley National Laboratory will use artificial intelligence and machine learning (AI/ML) techniques that combine experimental data with computer simulations to address mitigation strategies for Gulf Coast flooding events driven by extreme weather.
A collaborative team of scientists from the University of Connecticut and Lawrence Berkeley National Laboratory will couple experimental data with simulations using AI/ML techniques to design, manufacture, and test new materials with uniquely designed properties for potential applications in batteries, sensors, and energy storage.
A collaborative team of scientists from the University of Southern California, Argonne National Laboratory, and Lawrence Berkeley National Laboratory will develop AI/ML based methods to simulate and experimentally verify the performance of large, distributed computing infrastructures.
A few weeks ago, Jake O’Neal and his daughter traveled from their home in Monument, Colo., to visit the Utah State University campus where she will start her freshman year at the end of this month.
They were surprised to discover that the apartment complex where she planned to live, a short walk from campus, wasn’t finished. Not even close.
“There was no work being done, and you could clearly tell the building wasn’t ready,” O’Neal told me Friday. “Nobody was in their office. Nobody was answering the phone. So we said we’d better start looking into this a little closer.”
from National Bureau of Economic Research, Working Paper; Duha Tore Altindag, Elif S. Filiz & Erdal Tekin
The pandemic has revived the longstanding debate about the effect of online versus face-to-face instruction on student achievement. The goal of this paper is to provide new evidence on the impact of online versus face-to-face instruction on student learning outcomes, using rich, transcript-level longitudinal data from a public university. We pay particular attention to eliminating selection bias by incorporating student and instructor fixed effects into the empirical analysis as well as to separate out the impact of online versus in-person education from COVID-19-related confounding factors. Our results indicate that students in face-to-face courses perform better than their online counterparts with respect to their grades, the propensity to withdraw from the course, and the likelihood of receiving a passing grade. However, our investigation also reveals that instructor-specific factors, such as leniency in grading or actions towards preventing violations of academic integrity, play a significant role in determining the studied relationship. Without accounting for these instructor-specific factors, the relationship is severely biased, causing one to mistakenly conclude that online instruction is better for student learning than face-to-face instruction. Our analysis further documents a rise in grades associated with COVID-19-triggered changes to student assessment policies embraced by universities as well as instructors adopting a more flexible approach to grading. While these developments led to an increase in grades for all students overall, those who began Spring 2020 in face-to-face courses appear to have benefitted more generously from them. Finally, an auxiliary analysis shows that living in neighborhoods with better broadband technology is associated with a larger increase in grades among students who had to switch from in-person to online instruction during COVID-19. 
This finding supports the argument that unequal access to technology may have deepened learning disparities during the pandemic.
Purdue University’s Board of Trustees on Friday (Aug. 6) gave approval to plan, finance, construct and award construction contracts for renovations to two buildings on the West Lafayette campus – Schleman Hall of Student Services and Stewart Center.
Schleman Hall will become the new and expanded home for Purdue’s rapidly growing Data Science program – with 101,000 gross square feet of teaching labs, group workspaces, study space and offices – close to the College of Science’s departments of Computer Science, Mathematics, and Statistics, and building on sustained university investment that has supported unprecedented growth in these programs. [$52,800,000]
To find out more about the company’s approach to data, its partnerships, and why it chose build over buy for its machine learning technologies, we chatted with Tristan Burns, Pizza Hut’s global head of analytics. … Tristan Burns: You’re right — Pizza Hut was the first brand to create an online ordering experience. That was back in 1994, in California. You could submit an order online, it would end up in a store, and it’d be prepared and sent to your house, which was pretty cool. And while Pizza Hut was quite early to the ecommerce game, I think no one would mind me saying we were kind of eclipsed by Domino’s in the 2000s. They came out swinging, saying they were a tech company that makes pizzas, and with some pretty innovative technologies. Now Pizza Hut Digital Ventures, the organization I work for that is specific to Pizza Hut International, is taking a tech-first approach to redesigning, reimagining, and recreating our ecommerce capabilities. We’re in the process of building and scaling out some pretty robust solutions. It’s a very, very data-centric and very customer-centric approach.
In at least one sense, every man is an island, and every man-island has lots of inhabitants. But how they relate to each other is unclear.
A new project at Rice University’s Brown School of Engineering seeks to define the social order of such bacterial communities, collectively known as microbiomes. The initiative has received backing from the National Science Foundation in the form of a five-year, $2.8 million grant.
Led by Rice computer scientist Todd Treangen, the researchers will develop novel computational approaches to track environmental microbiome dynamics over time, across species and after perturbations. The researchers will start with biofilm-based “species abundance networks” on scaffolds and observe how they form their own genome-exchange networks.
from University of California System, Office of Scholarly Communication
In March 2020 the COVID Tracking Project at The Atlantic took on the responsibility of posting COVID-19 cases, testing, hospitalization, and death counts in the United States. The effort, led by The Atlantic and a team of 500 volunteers, proved to be invaluable when this key information was not otherwise reliably available as the worldwide pandemic unfolded.
The entirety of the COVID Tracking Project archives has been donated to UCSF, in collaboration with the California Digital Library. Further, the raw data have been published in Dryad, a data repository closely tied to the University of California.
from Penn State University, Institute for Computational and Data Sciences
State College, PA, and online, October 6-7. “Join the Institute for Computational and Data Sciences as we bring together researchers from around the U.S. to discuss data, equity, reproducibility, and other topics related to fairness.” [registration required]
from North Carolina State University, Crop and Soil Sciences News
Raleigh, NC, and online, October 9-10. “The event will center around the uses of the new BenchBot, a low-cost robot intended as a generalizable platform for sensing growing plants on greenhouse benches.” [registration required; must be a student]
Houston, TX, and online, October 25-27. “Recognizing that discovery and innovation happens at interfaces of disciplines and communities, the conference aims to bring together a diverse set of people from multiple communities spanning academia and industry.” Deadline for abstract submissions is August 20.
Find out more, and see how you can share your work via Submittable.
SPONSORED CONTENT
The eScience Institute’s Data Science for Social Good program is now accepting applications for student fellows and project leads for the 2021 summer session. Fellows will work with academic researchers, data scientists and public stakeholder groups on data-intensive research projects that will leverage data science approaches to address societal challenges in areas such as public policy, environmental impacts and more. Student applications due 2/15 – learn more and apply here. DSSG is also soliciting project proposals from academic researchers, public agencies, nonprofit entities and industry who are looking for an opportunity to work closely with data science professionals and students on focused, collaborative projects to make better use of their data. Proposal submissions are due 2/22.
Trafilatura is a Python package and command-line tool which seamlessly downloads, parses, and scrapes web page data: it can extract metadata, main body text and comments while preserving parts of the text formatting and page structure. The output can be converted to different formats.
It is not worth doing a PhD unless you do it at a research-intensive institution with a well-known adviser. Second-tier schools are not worth it. You are currently not competitive for those. First, most programs worth their salt need a minimum GPA of 3.0. Second, most good universities need to see some evidence of prior research experience. You don’t really seem to have that. What is this publication you speak of that you have from undergrad? You need at least 2 letters that will speak to your research prowess. Do you have those?
No one cares about your compelling personal statement, and in fact PhD research statements are mostly about research interests and fit, not your personal life.
I would suggest thinking about well-established MS programs (many of them have 3.0 minimum GPA requirements as well but may make exceptions).
Thousands of satellites are orbiting the Earth and observing conditions in the atmosphere, the oceans, and the land surface. Vast amounts of information are being collected all the time, but raw data needs manipulation before it becomes useful for scientific analysis. Python is a programming language that can be used to process satellite data sets for Earth science research. Earth Observation Using Python: A Practical Programming Guide is a new book recently published in AGU’s Special Publications series. It presents an introduction to basic Python programming that can be used to create functional and effective visualizations from earth observation satellite data sets. We asked the author about her vision for the book and how people can best utilize it.
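For a flavor of the kind of processing involved (this is an illustration with synthetic data, not an excerpt from the book), here is the standard NDVI vegetation index computed from red and near-infrared reflectance bands with NumPy:

```python
import numpy as np

# Synthetic 2x2 "scenes": surface reflectance in the red and
# near-infrared (NIR) bands, as a real sensor product would provide.
red = np.array([[0.08, 0.10], [0.30, 0.25]])
nir = np.array([[0.50, 0.45], [0.32, 0.28]])

# Normalized Difference Vegetation Index: healthy vegetation reflects
# strongly in NIR and absorbs red, so NDVI approaches +1 over plants
# and hovers near 0 over bare soil or built surfaces.
ndvi = (nir - red) / (nir + red)
print(ndvi.round(2))
```

In a real workflow the arrays would come from a satellite data file (e.g., NetCDF or GeoTIFF) rather than being typed in, but the band arithmetic is identical, and NumPy applies it element-wise across an entire scene at once.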
Large research and infrastructure projects increasingly involve significant collaborations across many sites with diverse teams. And while simply executing a series of proposed tasks from start to finish can yield strong reports and publications, this approach may thwart sustainable outcomes.
“Sustainability” means that research outcomes live on so that others can utilize them for as long as they are useful to a scientific enterprise. Researchers who think like entrepreneurs and comfortably revisit and “pivot” their activities as necessary can attain sustainable results. Pivoting should be a methodological process — not a demotivating “flavor of the week” that gives a team proverbial whiplash.
Researchers who seek to take a measured and justified approach to pivoting can borrow entrepreneurial techniques like the Entrepreneurial Operating System (EOS), which is described in Gino Wickman’s Traction. The EOS applies practical methods in a simple form with minimal overhead. I have personally been involved in the application of this method for three distinctly different projects—the Science Gateways Community Institute (SGCI), the Network for Computational Nanotechnology Cyber Platform (NCN-CP), and HUBzero®—and can attest that it fulfills its promise of enabling quick decisions and avoiding complexity.