With Arizona experiencing its hottest summer on record last year, identifying heat-mitigation strategies and solutions is already a complex issue, and the lasting effects of racially based redlining implemented throughout the 20th century only add to its complexity.
Redlining was the practice of denying loans to people of color and low-income individuals based on the perceived financial risk of the neighborhoods where they lived. In effect, the practice actively enforced the separation of races during segregation.
On the Mapping Inequality: Redlining in New Deal America website, a map of Phoenix shows how the city’s neighborhoods were categorized by lenders in 1940.
Masked language modeling (MLM) — a task in which part of the input text is masked and the model is asked to predict the missing tokens — has become the default approach to pretraining text models. A number of alternative approaches have recently been proposed to enrich word representations with external knowledge sources. However, these models are designed and assessed only in a monolingual setting, which is limiting for obvious reasons. Iacer Calixto, a visiting academic currently working with researchers at CDS, has co-authored a project, “Wikipedia Entities as Rendezvous across Languages: Grounding Multilingual Language Models by Predicting Wikipedia Hyperlinks,” that proposes a language-independent entity prediction task as an intermediate training procedure to ground word representations in entity semantics. The project originated from Iacer’s current project, Improving Multi-modal language Generation wIth world kNowledgE (IMAGINE), a core goal of which is linking language models to massive knowledge graphs such as Wikipedia. “Wikipedia Entities” was also recently accepted to the 2021 Annual Conference of the North American Chapter of the Association for Computational Linguistics (NAACL).
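The masking task described above can be sketched in a few lines. The whitespace tokenizer, mask rate, and fixed random seed below are illustrative assumptions for the sketch, not the setup used in the paper:

```python
import random

MASK = "[MASK]"

def mask_tokens(tokens, mask_rate=0.15, rng=None):
    """Replace a fraction of tokens with [MASK]; return the masked
    sequence and the positions/tokens the model must predict."""
    rng = rng or random.Random(0)  # fixed seed so the sketch is reproducible
    masked, labels = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_rate:
            masked.append(MASK)
            labels[i] = tok  # the model is trained to recover this token
        else:
            masked.append(tok)
    return masked, labels

masked, labels = mask_tokens("the cat sat on the mat".split(), mask_rate=0.5)
```

During pretraining, the model sees only `masked` and is scored on how well it recovers the tokens stored in `labels`; the entity-prediction task in the paper swaps the masked-token targets for language-independent Wikipedia entities.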
Publishing their findings today in Nature Plants, an international team of researchers led by the University of Birmingham sets out the following steps needed to use AI to harness the power of nanomaterials safely, sustainably and responsibly:
Understand the long-term fate of nanomaterials in agricultural environments – how nanomaterials can interact with roots, leaves and soil;
Assess the long-term life cycle impact of nanomaterials in the agricultural ecosystem, such as how repeated application of nanomaterials will affect soils;
Take a systems level approach to nano-enabled agriculture – use existing data on soil quality, crop yield and nutrient-use efficiency (NUE) to predict how nanomaterials will behave in the environment; and
Use AI and machine learning to identify key properties that will control the behaviour of nanomaterials in agricultural settings.
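As a toy illustration of that last step — using machine learning to identify which properties matter — one simple baseline is to rank candidate properties by how strongly each correlates with an observed outcome. The property names and numbers below are invented for illustration; a real study would use measured soil and nanomaterial data and richer models:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical measurements: each candidate property vs. an observed outcome,
# here nutrient-use efficiency (NUE). All values are made up.
properties = {
    "particle_size_nm":  [20, 40, 60, 80, 100],
    "surface_charge_mV": [-30, -10, 5, -20, 15],
    "dose_mg_per_kg":    [1, 2, 3, 4, 5],
}
nue = [0.9, 0.8, 0.7, 0.6, 0.5]  # toy outcome values

# Rank properties by absolute correlation with the outcome.
ranking = sorted(properties, key=lambda p: -abs(pearson(properties[p], nue)))
```

In practice the paper's authors envision far more capable models, but the shape of the question is the same: which inputs most control the behavior of nanomaterials in soil?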
The American anthropologist Laura Bohannan once tried to paraphrase “Hamlet” for a tribe of West African bush people. Convinced that “human nature is pretty much the same the whole world over,” Bohannan chose “Hamlet” as a reliable universal archetype. It sounds good on paper, but at practically every sentence, she found her listeners raising objections and interpolations wholly outside her frame of reference.
Physicists love recreating the world in software. A simulation lets you explore many versions of reality to find patterns or to test possibilities. But if you want one that’s realistic down to individual atoms and electrons, you run out of computing juice pretty quickly.
Machine-learning models can approximate detailed simulations, but they often require lots of expensive training data. A new method shows that physicists can lend their expertise to machine-learning algorithms, helping them train on a few small simulations of a few atoms each, then predict the behavior of systems with hundreds of atoms. In the future, similar techniques might even characterize microchips with billions of atoms, predicting failures before they occur.
The researchers started with simulated units of 16 silicon and germanium atoms, two elements often used to make microchips. They employed high-performance computers to calculate the quantum-mechanical interactions between the atoms’ electrons. Given a certain arrangement of atoms, the simulation generated unit-level characteristics such as its energy bands, the energy levels available to its electrons. But “you realize that there is a big gap between the toy models that we can study using a first-principles approach and realistic structures,” says Sanghamitra Neogi, a physicist at the University of Colorado, Boulder, and the paper’s senior author. Could she and her co-author, Artem Pimachev, bridge the gap using machine learning?
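The gap-bridging idea can be caricatured as a surrogate model: fit a cheap regressor on a handful of small, expensive first-principles simulations, then query it for system sizes you could never simulate directly. Everything below — the single-feature linear form and the synthetic energy values — is a stand-in for illustration, not the authors' actual model:

```python
def fit_line(xs, ys):
    """Least-squares fit of y = a*x + b (closed form for one feature)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# "Training data": a few small simulated cells, mapping atom count to a
# toy unit-level energy value. The numbers are invented.
atoms = [8, 12, 16]
energy = [1.10, 1.06, 1.02]

a, b = fit_line(atoms, energy)
predicted = a * 100 + b  # extrapolate to a 100-atom system, far beyond the training sizes
```

The real work replaces the toy linear fit with models that learn quantum-mechanical features such as energy bands, but the economics are identical: pay the first-principles cost only for small cells, then extrapolate.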
Going up against an algorithm was a battle unlike any other Larkin Seiler had faced.
Because of his cerebral palsy, the 40-year-old, who works at an environmental engineering firm and loves attending sports games of nearly any type, depends on his home care support person for assistance with things most people take for granted, like meals and bathing.
Every morning, Seiler’s support worker lifts him out of bed, positions him in his wheelchair and helps him get dressed for the coming workday. The worker checks back in at lunch time to help with lunch and toileting, then returns again in the evening.
But when Seiler’s home state of Idaho created an automated system – an algorithm – to apportion home care assistance for people with disabilities in 2008, it cut his home care budget in half. He faced being unable to even use the bathroom at reasonable intervals.
Originally from Baltimore, Christine spent two years in Stockholm, most recently as head of data and insights for Universal Music Sweden. She now calls London home, working as global marketing director for data and insights at Warner Music Group, where she uses data to identify and accelerate the careers of the next global superstars and co-leads WMG U.K.’s employee resource group, The Link, for BAME/POC employees and allies.
Christine is also the founder of Measure of Music—part conference, part hackathon—to introduce others to the world of music and data.
We caught up with Christine for our Liner Notes series to learn more about her musical tastes and journey through the years, as well as recent work she’s proud of and admired.
Deep neural networks will move past their shortcomings without help from symbolic artificial intelligence, three pioneers of deep learning argue in a paper published in the July issue of the Communications of the ACM journal.
In their paper, Yoshua Bengio, Geoffrey Hinton, and Yann LeCun, recipients of the 2018 Turing Award, explain the current challenges of deep learning and how it differs from learning in humans and animals. They also explore recent advances in the field that might provide blueprints for the future directions for research in deep learning.
Image annotation services are becoming more important as computer vision–based AI models are developed for new fields. Agriculture is one of the most vital fields in need of such cutting-edge technology to improve crop yields and boost productivity with fewer resources.
Image annotation is a precise technique for making objects recognizable to machines through deep learning algorithms. In agriculture, image annotation helps make crops and other objects in the field recognizable, so that the right decisions can be made without human intervention. So, let’s look at what image annotation can do for agriculture and how it is used in machine learning and AI.
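Concretely, an annotation for a crop image is usually just labeled geometry attached to an image. The schema below — the field names, class labels, and box format — is a generic sketch of a bounding-box record, not any particular tool's format:

```python
# One bounding-box annotation: pixel coordinates (x, y, width, height)
# plus a class label that a detection model can learn from.
def make_annotation(image_id, label, x, y, w, h):
    return {
        "image_id": image_id,
        "label": label,               # e.g. "wheat_head", "weed", "lesion"
        "bbox": [x, y, w, h],         # top-left corner + size, in pixels
        "area": w * h,                # handy for filtering out tiny boxes
    }

ann = make_annotation("field_0042.jpg", "weed", 120, 80, 40, 55)
```

A labeled dataset is simply thousands of such records; the model trains to reproduce the boxes and labels on unseen field images.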
In a Memorandum Opinion and Order (MO&O) released June 17, the FCC denied a Petition for Reconsideration filed by HobbyKing of a $2,861,128 fine for marketing noncompliant RF equipment and for failing to respond to FCC orders during the investigation of the company’s practices. In denying the petition, the FCC upheld the fine under its equipment marketing rules. The fine resulted from an FCC investigation initiated by ARRL’s January 2017 complaint that the HobbyKing equipment was “blatantly illegal at multiple levels.”
The eScience Institute’s Data Science for Social Good program is now accepting applications for student fellows and project leads for the 2021 summer session. Fellows will work with academic researchers, data scientists and public stakeholder groups on data-intensive research projects that will leverage data science approaches to address societal challenges in areas such as public policy, environmental impacts and more. Student applications are due 2/15 – learn more and apply here. DSSG is also soliciting project proposals from academic researchers, public agencies, nonprofit entities and industry partners who are looking for an opportunity to work closely with data science professionals and students on focused, collaborative projects to make better use of their data. Proposal submissions are due 2/22.