Data Science Newsletter – July 3, 2018

Data Science Newsletter features journalism, research papers, events, tools/software, and jobs for July 3, 2018


 
 
Data Science News



J.P. Morgan’s data scientists are about to launch something wild

eFinancialCareers, Sarah Butcher



Something is underway at J.P. Morgan. In the coming weeks or months, the bank will be launching a new website to allow anyone to compete in the creation of accurate predictions based on large sets of data.

To be known as RoarData, the site isn’t live yet, but the bank plans to launch it in mid-2018 (after delaying the launch from early 2018). J.P. Morgan has been busy hiring for the Roar team, which sits within the data science unit of its corporate and investment bank (CIB). Applicants need to be “exceptional coders” who are familiar with key machine learning tools like TensorFlow and Chainer. They will work alongside entrepreneurial data scientists who have built and sold products of their own, including Peter Cotton, a J.P. Morgan executive director with a PhD in maths from Stanford who sold his data company to Bloomberg, and Rajesh Tolani, the recently promoted head of data science at the corporate and investment bank and owner of the Roar product.


5G Beam-Steering Antennas: More Accurate, Less Power Hungry

IEEE Spectrum, Dexter Johnson



… researchers in Japan have taken an entirely new approach. They use something called a local oscillator to steer the beams. These local oscillators, in combination with a mixer, can change the frequency of the signal. The scientists, from the Tokyo Institute of Technology, believe that this new approach will lead to large-scale phased-array transceivers capable of increasing communication distance, data rate, and network capacity.
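
The beam-steering arithmetic behind this is the standard phased-array relation: each antenna element needs a phase offset proportional to its position and to the sine of the steering angle, and in an LO-phase-shifting architecture that offset is applied to the local oscillator and transferred onto the signal by the mixer. Below is a minimal sketch of that relation with purely illustrative numbers; it shows the textbook formula, not the Tokyo Tech circuit itself.

```python
import numpy as np

# Textbook phase offsets for steering a uniform linear array.
# Illustrative assumptions only: 28 GHz carrier, 8 elements,
# half-wavelength spacing, 20-degree steering angle.
c = 3e8                   # speed of light, m/s
freq = 28e9               # assumed mmWave carrier frequency, Hz
lam = c / freq            # wavelength, m
d = lam / 2               # element spacing, m
n = np.arange(8)          # element indices
theta = np.deg2rad(20)    # desired beam direction off boresight

# Phase each element must apply so its wavefront adds constructively at
# angle theta; in an LO-phase-shifting design this phase is imposed on the
# local oscillator and the mixer carries it onto the transmitted signal.
phase = 2 * np.pi * d * n * np.sin(theta) / lam
print(np.rad2deg(phase) % 360)   # per-element phase shifts, in degrees
```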


AI Invasion of Wall Street Is Reshaping BofA’s Currency Research

Bloomberg Technology, Ivan Levingston



For the team’s first study, Bank of America’s machine-learning algorithms sifted through fundamental and survey data, such as government spending and consumer confidence, to determine how the euro-dollar currency pair might perform. The team used both supervised learning, in which the machine is trained on how to process the information, and unsupervised learning, in which no classification guidelines are given.

The bank’s models concluded that in the aftermath of the Italian election, in which euro-skeptic parties swept into power, the common currency would likely weaken. However, fears of a deep and sustained selloff against the dollar, like the one witnessed during the European debt crisis, were overblown.

Despite all the hype surrounding AI, most banks are still scratching the surface. A vast majority of financial institutions in a Digital Banking Report survey last fall said they used some form of machine learning, but less than 20 percent went beyond “fraud, risk and compliance,” said publisher Jim Marous.
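
For readers newer to the terminology, the supervised/unsupervised split described in the first paragraph of this excerpt is easy to illustrate. The sketch below uses entirely synthetic “macro” features and off-the-shelf scikit-learn models; it is a hypothetical illustration of the two learning modes, not Bank of America’s methodology.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))   # synthetic stand-ins for fundamental/survey data
# Synthetic label: 1 if EUR/USD rose over the following period (made up).
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)

# Supervised learning: the model is trained against labeled outcomes.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X[:400], y[:400])
print("out-of-sample directional accuracy:", clf.score(X[400:], y[400:]))

# Unsupervised learning: no labels are given; the model simply groups
# observations into clusters of similar "macro regimes."
regimes = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print("observations per regime:", np.bincount(regimes))
```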


Risk Executives’ New Challenge: An Abundance of Technology

Global Association of Risk Professionals, Katherine Heires



Spending on risk management and regulatory compliance functions is on the rise. That’s good news for risk managers at a time when companies in the financial services industry are trying to hold the line on costs. But they are also under pressure to invest the funds wisely in advanced technology that may already be bringing major changes to the way they do their jobs.

“There are expectations at the top for firms to do more with less, and so, even though spending has increased over the last 10 years, firms are now thinking about how to be smarter in the use of technology for regulatory and risk management purposes,” says Cubillas Ding, research director at Celent and author of Risk Management and Compliance 2018: CROs Navigate NextGen Tech.

Ding’s recently published report, along with others from the likes of Accenture, Deloitte, IDC and Protiviti, describes an abundance of applied and emerging technological innovations in such areas as artificial intelligence (AI) and machine learning; blockchain, or distributed ledger technology; cloud computing and open sourcing; and natural language processing (NLP) and robotic process automation (RPA).


Regents approve academic offerings

Brookings Register (SD), South Dakota Board of Regents



The South Dakota Board of Regents this week approved new academic program requests from four public universities to meet emerging workforce trends across the state. South Dakota State University’s requests were:

New associate and bachelor’s degree offerings in data science. The programs will use data science-centered mathematics, statistics and statistical computation courses created over the past several years by SD State’s Department of Mathematics and Statistics. Studies indicate high job-growth potential for data scientists trained at the undergraduate and graduate levels. Instruction for these degree programs will also be available online.


IU, Energy Sciences Network receive $3.5M grant to help researchers accelerate big data sharing

Indiana University, News at IU



The National Science Foundation has awarded Indiana University and the Energy Sciences Network a three-year, $3.5 million grant to help scientists more efficiently work with massive datasets that have become essential to modern scientific discovery.

The funding will create EPOC, the Engagement and Performance Operations Center, as a collaborative focal point jointly led by IU International Networks and ESnet, a high-performance network user facility that serves U.S. Department of Energy scientists and their collaborators worldwide. The new center will allow researchers to routinely, reliably and robustly transfer data through a holistic approach to understanding the full pipeline of data movement — and to better support collaborative science.


AI2 taps University of Washington researcher to lead $125M ‘common sense AI’ initiative

GeekWire, Clare McGrane



The Allen Institute for Artificial Intelligence in Seattle is making an ambitious bid to give AI common sense, a major factor in taking the technology beyond its current limitations.

Now the institute has hired a new leader for the project: University of Washington Professor Yejin Choi. Choi will take the helm of Project Alexandria, the “common sense AI” initiative backed by $125 million from Paul Allen, the Microsoft co-founder and founder of the institute.

“Common sense is what makes the fundamental difference between human intelligence and machine intelligence today,” Choi told GeekWire. “Our research will help [enable] AI technologies that are significantly more intelligent and robust than what are practically possible today.”


Feature: AI education booming as China cultivates talent

Xinhua, Guo Ying and Yu Jingjing



China’s booming artificial intelligence industry has resulted in a growing demand for talent. To build a strong AI talent pool, China is now fostering AI education in universities by improving the curriculum and promoting interdisciplinary research.

Tsinghua University established its Institute of Artificial Intelligence on Thursday as part of its efforts to advance AI research and education.

Aiming to become a globally influential AI research institution, the institute will focus on the basic theory of AI and actively promote cross-disciplinary AI research as well as the integration of academia and industry.


Joint Artificial Intelligence Center Created Under DoD CIO

Breaking Defense, Sydney J. Freedberg Jr.



The Pentagon has created a new Joint Artificial Intelligence Center (JAIC) that will have oversight of almost all service and defense agency AI efforts. This coordination function is crucial to the emerging AI arms race with Russia and China, experts told us.

The JAIC will report to Chief Information Officer Dana Deasy, the establishing memo by Deputy Defense Secretary Patrick Shanahan says. Its ambit is not quite untrammeled; projects under $15 million remain under the authority of the service or agency. The JAIC will establish a common set of AI “standards … tools, shared data, reusable technology, processes, and expertise” for the whole Defense Department, according to the June 27 memo.


Apple is rebuilding Maps from the ground up

TechCrunch, Matthew Panzarino



Apple, it turns out, is aware of this, so it’s rebuilding the maps part of Maps.

It’s doing this by using first-party data gathered by iPhones with a privacy-first methodology and its own fleet of cars packed with sensors and cameras. The new product will launch in San Francisco and the Bay Area with the next iOS 12 beta and will cover Northern California by fall.

Every version of iOS will get the updated maps eventually, and they will be more responsive to changes in roadways and construction, more visually rich depending on the specific context they’re viewed in and feature more detailed ground cover, foliage, pools, pedestrian pathways and more.

This is nothing less than a full reset of Maps, and it’s been four years in the making, which is when Apple began to develop its new data-gathering systems. Eventually, Apple will no longer rely on third-party data to provide the basis for its maps, which has been one of its major pitfalls from the beginning.


Identifying Future Victims of Climate Change

The Scientist Magazine®, Catherine Offord



In late 2014, conservationist Ian Gynther lost hope. After days spent crawling into rock crevices, scouring through camera-trap footage, and carefully laying bait around Bramble Cay—a tiny island at the northern end of Australia’s Great Barrier Reef—there was little room for doubt. The Bramble Cay melomys (Melomys rubicola), a furry little rodent endemic to the island, had gone extinct. “My colleagues and I were devastated,” Gynther, a senior conservation officer at Queensland’s Department of Environment and Heritage Protection, later told The Guardian. “As each day of our comprehensive survey passed without revealing any trace of the animal, we became more and more depressed.”

The disappearance of the Bramble Cay melomys became a grim milestone in the history of conservation biology. Its extinction report, published in 2016, determined the cause of death to be anthropogenic climate change, the first such attribution for a mammalian species [1]. The rodents’ home had been battered by increasingly extreme weather, storm surges, and rising sea levels, Gynther and his colleagues wrote in the report, pointing “to human-induced climate change being the root cause.”

The melomys will not be the last species to meet this fate. As global temperatures rise, more and more of the Earth’s millions of species are experiencing environmental change at a rate that may well be unprecedented in our planet’s history. A recent meta-analysis of research on more than 2,000 species suggested that nearly 50 percent of threatened, nonflying terrestrial mammals and 23 percent of threatened birds had already been negatively affected by climate change in at least part of their ranges [2]. And with climate change accelerating many deleterious global dynamics, such as ice melt and ocean acidification, the damage is likely to continue.


Researchers apply computing power to track the spread of cancer

Princeton University, School of Engineering and Applied Science



Princeton researchers have developed a new computational method that increases the ability to track the spread of cancer cells from one part of the body to another.

This migration of cells can lead to metastatic disease, which causes about 90 percent of cancer deaths from solid tumors — masses of cells that grow in organs such as the breast, prostate or colon. Understanding the drivers of metastasis could lead to new treatments aimed at blocking the process of cancer spreading through the body.

“Are there specific changes, or mutations, within these cells that allow them to migrate?” asked Ben Raphael, a professor of computer science at Princeton and the senior author of the new research. “This has been one of the big mysteries.”

 
Deadlines



1st International Workshop on Energy Efficient Data Mining and Knowledge Discovery

Dublin, Ireland, September 10-14, co-located with ECML PKDD 2018. The deadline for workshop paper submissions is July 12.
 
Tools & Resources



Introducing the Ground Control Point interface

Stamen Design, Eric Brelsford



“OpenDroneMap is an open source toolkit that can help you do that. It takes aerial imagery such as that created with your drone and turns it into several other types of data, such as point clouds, meshes, and orthophotos.”
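
OpenDroneMap ties imagery to real-world coordinates through a ground control point file (commonly gcp_list.txt), which is what Stamen’s new interface helps you build. Below is a rough Python sketch of reading such a file; the layout assumed here (a coordinate-system header line followed by rows of geo_x geo_y geo_z im_x im_y image_name) is based on the ODM documentation, so treat the project docs as the authoritative reference.

```python
# Rough sketch of parsing an OpenDroneMap ground control point file.
# Assumed format: first line is the coordinate system (e.g. "EPSG:4326"
# or a proj4 string); each following line maps a ground coordinate to a
# pixel location in one image:
#   geo_x geo_y geo_z im_x im_y image_name

def read_gcp_file(path):
    with open(path) as f:
        lines = [line.strip() for line in f if line.strip()]
    crs = lines[0]
    points = []
    for row in lines[1:]:
        parts = row.split()
        geo_x, geo_y, geo_z, im_x, im_y = map(float, parts[:5])
        points.append({"geo": (geo_x, geo_y, geo_z),
                       "pixel": (im_x, im_y),
                       "image": parts[5]})
    return crs, points

crs, points = read_gcp_file("gcp_list.txt")   # hypothetical local file
print(crs, "-", len(points), "ground control points")
```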


Interacting with AWS from R

R-bloggers, Digital Age Economist



“The cloudyr project makes R a lot better at interacting with cloud-based computing infrastructure. With this in mind, I have been playing with the aws.ec2 package, which is a simple client package for the Amazon Web Services (‘AWS’) Elastic Compute Cloud (EC2) API. There is some irritating setup that has to be done, so if you want to use this package, you need to follow the instructions on the GitHub page to create the AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY and AWS_DEFAULT_REGION parameters in the ENV. But once you have figured out this step, the fun starts.”
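
The same credentials-via-environment pattern carries over to AWS clients in other languages. Here is an analogous sketch in Python using boto3 (not the cloudyr/aws.ec2 R package the post describes), with placeholder values standing in for real credentials.

```python
import os
import boto3

# Like the R client described above, boto3 picks up its credentials and
# region from these environment variables. The values here are placeholders.
os.environ.setdefault("AWS_ACCESS_KEY_ID", "<your-access-key-id>")
os.environ.setdefault("AWS_SECRET_ACCESS_KEY", "<your-secret-access-key>")
os.environ.setdefault("AWS_DEFAULT_REGION", "us-east-1")

ec2 = boto3.client("ec2")

# List the instance IDs and states visible to these credentials.
for reservation in ec2.describe_instances()["Reservations"]:
    for instance in reservation["Instances"]:
        print(instance["InstanceId"], instance["State"]["Name"])
```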
