NYU Data Science Newsletter – January 8, 2016

NYU Data Science Newsletter features journalism, research papers, events, tools/software, and jobs for January 8, 2016


 
Data Science News



The physics of life

Nature News & Comment


from January 05, 2016

From flocking birds to swarming molecules, physicists are seeking to understand ‘active matter’ — and looking for a fundamental theory of the living world.

 

How Economics Went From Theory to Data

Bloomberg View, Justin Fox


from January 06, 2016

One of the most striking things about attending the annual meeting of the American Economic Association after a long absence is that economics is now really all about the data. Older theorists such as Eric Maskin (who won the 2007 economics Nobel), Jean Tirole (the 2014 Nobelist) and Bengt Holmstrom were still accorded prominent roles as luncheon speakers at this week’s gathering in San Francisco, but in the sessions where actual research was being presented most of the activity and excitement surrounded empirical work.

 

Exclusive Interview: Gideon Mann, Head of Data Science, Bloomberg

AI Business


from January 06, 2016

AIBusiness.org has secured an exclusive interview with Gideon Mann, Head of Data Science for Bloomberg L.P. Gideon shares how one of the world’s leading media brands is planning to build hybrid systems that augment human ability. AI will help Bloomberg deliver a significant edge to clients by helping them be first, be better informed, and take risks.

 

It’s Harder Than It Looks To Link Inequality With Global Turmoil

FiveThirtyEight, Jay Ulfelder


from January 07, 2016

For thousands of years, close observers of politics have claimed that economic inequality causes political turmoil. In 350 B.C., Aristotle identified inequality as a principal driver of revolution and state collapse. In 19th-century Europe, Karl Marx concurred (and, in a fashion, applauded). In 2011, as the Arab Spring unfolded, Kenyan journalist John Githongo claimed in The New York Times that “radical and growing economic inequality animated much of what was at stake in the various Arab uprisings, and it will play a major role in shaping African politics for years to come.” And in a recent column for Le Monde, economist Thomas Piketty argued that “terrorism feeds on the Middle Eastern powder keg of inequality.”

Just because a belief is widely held, however, does not make it true. In fact, it’s still hard to establish with confidence whether and how economic inequality shapes political turmoil around the world.

 

Academics, we need to talk.

Matt Welsh, Volatile and Decentralized blog


from January 06, 2016

Although I made the move to industry a bit more than five years ago, I still serve on program committees and review articles for journals and the like. So it’s painful for me to see some of my academic colleagues totally botch it when it comes to doing industry-relevant research. Profs, grad students: we need to talk.

 

Chicago Is Using Data to Predict Food Safety Violations. Why Aren’t Other Cities?

CityLab, Julian Spector


from January 07, 2016

The three dozen inspectors at the Chicago Department of Public Health scrutinize 16,000 eating establishments to protect diners from gut-bombing food sickness. Some of those pose more of a health risk than others; approximately 15 percent of inspections catch a critical violation.

For years, Chicago, like most every city in the U.S., scheduled these inspections by going down the complete list of food vendors and making sure they all had a visit in the mandated timeframe. That process ensured that everyone got inspected, but not that the most likely health code violators got inspected first. And speed matters in this case. Every day that unsanitary vendors serve food is a new chance for diners to get violently ill, paying in time, pain, and medical expenses.

That’s why, in 2014, Chicago’s Department of Innovation and Technology started sifting through publicly available city data and built an algorithm to predict which restaurants were most likely to be in violation of health codes, based on the characteristics of previously recorded violations.
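
The article does not walk through the model itself, but as a rough, hypothetical sketch of the general approach it describes (ranking establishments by predicted violation risk from past inspection records), something like the following could work; the file and feature names below are invented, and this is not the city’s actual implementation.

```python
# Hypothetical sketch only: rank establishments by predicted risk of a critical
# violation. File and column names are invented; this is not Chicago's model.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

inspections = pd.read_csv("past_inspections.csv")  # one row per past inspection
features = ["days_since_last_inspection", "past_critical_violations",
            "nearby_sanitation_complaints", "license_age_years"]
X, y = inspections[features], inspections["had_critical_violation"]

# Hold out some past inspections to check the model actually ranks risk well.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = GradientBoostingClassifier().fit(X_train, y_train)
print("held-out AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))

# Score establishments awaiting inspection and send inspectors to the riskiest first.
upcoming = pd.read_csv("upcoming_inspections.csv")
upcoming["risk"] = model.predict_proba(upcoming[features])[:, 1]
print(upcoming.sort_values("risk", ascending=False).head(10))
```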

 
Events



UAB TED Prize Winner, Sarah Parcak, on Colbert’s Show Friday



UAB “space archaeologist” and 2015 TED Prize winner Dr. Sarah Parcak will be a guest on The Late Show with Stephen Colbert this Friday, January 8, 2016. Check your local listings for show times.

Dr. Parcak is known for using advanced algorithms to pinpoint historical sites using satellite imagery. This allows researchers to investigate the sites without disturbing the ruins. Her research has also been instrumental in identifying historical areas that are being impacted by looters.

 

Virtual Assistant Summit



AI, machine learning, speech recognition and NLP technologies are converging to allow the creation of intelligent virtual assistants, explored at the first-ever Virtual Assistant Summit. Also, at the same time and place: the Deep Learning Summit.

Thursday-Friday, January 28-29, in San Francisco. Use code KDNUGGETS to save 20%.

 

Personalizationpalooza



Building upon the success of Personalizationpalooza 2015, this event will be a half-day program featuring a dozen speakers delivering lightning talks, demonstrations and discussion focused on data science and the opportunity for increasingly personalized experiences across the media/tech industry. You will hear from startup founders and corporate data scientists who are focusing on data-driven products and services, and from faculty experts in the fields of big data and computer science.

Thursday, February 25, starting at 8 a.m., at the Joseph Urban Theater in Hearst Tower (300 West 57th Street).

 
Deadlines



Social Media and Demographic Research: Applications and Implications

deadline: Friday, March 18, 2016

This Research Workshop is co-organized by the IUSSP Scientific Panel on Big Data and Population Processes as a side event at the 10th International AAAI Conference on Web and Social Media (ICWSM-16), taking place in Cologne, Germany, 17-20 May 2016.

Deadline for submissions is Friday, March 18.

 
Tools & Resources



How to Prevent Bad Science

Galvanize, Benjamin Skrainka


from January 05, 2016

Bad science can damage your reputation, as Pons and Fleischmann discovered, and can cost companies real money, whether through faulty forecasts that lead to hundred-million-dollar capacity planning errors or incorrect analysis that sends marketers to advertise in ineffective channels. Verification, Validation, and Uncertainty Quantification (VVUQ) is a framework from the modeling and simulation community which can be used to think about the correctness of scientific models. Using this framework will help you have more confidence in your results and avoid costly blunders.

 

Pythor – Python meets R

GitHub, Nipun Batra


from January 07, 2016

I typically use Python for my data analysis. However, there are many packages that exist only in R. RPy2 allows interaction between R and Python. The following post is an attempt to show how we can leverage the awesomeness of R’s packages in Python: Python meets R, hence Pythor. The GitHub link to the project: https://github.com/nipunbatra/pythor
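
As a flavor of what the RPy2 bridge looks like in practice, here is a minimal, generic sketch (not code from the Pythor repository) that fits a linear model with R’s built-in stats package from Python; it assumes R and rpy2 are installed.

```python
# Minimal RPy2 sketch: call R's lm() from Python. Requires R and the rpy2 package.
from rpy2.robjects import FloatVector, Formula
from rpy2.robjects.packages import importr

stats = importr("stats")
base = importr("base")

# Toy data; in practice these could come from a pandas DataFrame.
x = FloatVector([1.0, 2.0, 3.0, 4.0, 5.0])
y = FloatVector([2.1, 3.9, 6.2, 8.1, 9.8])

# Build the R formula y ~ x and attach the data to its environment.
fmla = Formula("y ~ x")
fmla.environment["x"] = x
fmla.environment["y"] = y

# Fit in R, then read the coefficient table back on the Python side.
fit = stats.lm(fmla)
print(base.summary(fit).rx2("coefficients"))
```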

 

Your Neighborhood’s Crime Rank – Insights from the NYPD’s Most Detailed Data Release Ever

I Quant NY


from January 04, 2016

… something big quietly happened last week; something that starts to put some of the issues I just outlined to bed. For the first time in its history, the New York Police Department released incident-level reported crime data. That means for each individual reported felony (for the first three quarters of the year), the citizens of New York now have access to data showing where and when that felony was reported instead of a single number telling us the total number that were reported in a precinct over a period of time. There are countless insights to derive within, and with this post I am going to start to scratch the surface.
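
For readers who want to dig in themselves, a hypothetical sketch of the kind of aggregation that incident-level data makes easy is shown below; the file name and columns are invented for illustration, not the NYPD’s actual schema.

```python
# Hypothetical sketch of what incident-level data enables; the file name and
# column names are invented for illustration, not the NYPD's actual schema.
import pandas as pd

incidents = pd.read_csv("nypd_reported_felonies_2015_q1_q3.csv",
                        parse_dates=["occurrence_datetime"])

# The old precinct totals are now just one aggregation away...
by_precinct = incidents.groupby("precinct").size().sort_values(ascending=False)
print(by_precinct.head(10))

# ...but so are finer slices, e.g. reported felonies by hour of day.
by_hour = incidents["occurrence_datetime"].dt.hour.value_counts().sort_index()
print(by_hour)
```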

 

Machine Learning is Fun! Part 2

Medium, Adam Geitgey


from January 03, 2016

In Part 1, we said that Machine Learning is using generic algorithms to tell you something interesting about your data without writing any code specific to the problem you are solving. (If you haven’t already read part 1, read it now!).

This time, we are going to see one of these generic algorithms do something really cool: create video game levels that look like they were made by humans. We’ll build a neural network, feed it existing Super Mario levels and watch new ones pop out!
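
One common way to realize this kind of generator is a character-level recurrent network trained on levels encoded as text; the sketch below is a hypothetical illustration of that idea in Keras, not the article’s code, and levels.txt is an assumed input file.

```python
# Hypothetical sketch of the idea, not the article's code: treat level layouts as
# text and train a character-level LSTM to generate new sequences. "levels.txt"
# (levels encoded as characters) is an assumed input file.
import numpy as np
from tensorflow.keras.layers import LSTM, Dense
from tensorflow.keras.models import Sequential

text = open("levels.txt").read()
chars = sorted(set(text))
char_to_idx = {c: i for i, c in enumerate(chars)}

# Build one-hot training windows: predict the next character from the previous 40.
seq_len = 40
X = np.zeros((len(text) - seq_len, seq_len, len(chars)), dtype=np.float32)
y = np.zeros((len(text) - seq_len, len(chars)), dtype=np.float32)
for i in range(len(text) - seq_len):
    for t, c in enumerate(text[i:i + seq_len]):
        X[i, t, char_to_idx[c]] = 1.0
    y[i, char_to_idx[text[i + seq_len]]] = 1.0

model = Sequential([LSTM(128, input_shape=(seq_len, len(chars))),
                    Dense(len(chars), activation="softmax")])
model.compile(loss="categorical_crossentropy", optimizer="adam")
model.fit(X, y, batch_size=128, epochs=20)

# Generate new level text one character at a time, seeded with an existing snippet.
generated = text[:seq_len]
for _ in range(400):
    x_pred = np.zeros((1, seq_len, len(chars)), dtype=np.float32)
    for t, c in enumerate(generated[-seq_len:]):
        x_pred[0, t, char_to_idx[c]] = 1.0
    probs = model.predict(x_pred, verbose=0)[0]
    generated += chars[int(np.argmax(probs))]
print(generated)
```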

 

Unearthing Data to Unleash Impact: Using Unique Data Sources to Drive Change

yhat blog, Nick Eng and Neal Myrick


from January 06, 2016

Using data for good is a journey: from finding the best data, to structuring it to unleash its potential, to effectively communicating the results, the pathway is always different and never dull. We’re excited to share three of our favorite journeys below that not only show data is all around us, but that it can also be a tremendous force for good.

 
