New technology broadens the scope of economic analysis and allows practitioners to develop new strategies.
With that in mind, I would like to give a presentation showing how we can leverage technology to get the most out of data. The presentation is not meant to teach coding; rather, it aims to show what the new technologies make possible.
In the presentation, you will see:
- How to use an API to get more, and more accurate, information: example from
- How to use Deep Learning to create new features
- How to convert text into
- How to extract verbs, nouns, and adjectives from a document
- How to leverage state-of-the-art models to predict sentiment from a document
- How to convert a PDF into a dataframe
There are no prerequisites. The basic idea behind the presentation is to show what can be done with new tools and how an economist can borrow techniques from the data science world.
If you are interested in participating in the presentation, please fill in the information below.
A minimum of 5 participants is required for the presentation to take place.
Current number of participants:
If you have any questions, feel free to contact me:
To share the document, use this link:
The presentation will draw on recent research I am doing on "Environmental, Social, and Governance" (ESG).
The workflow is as follows:
1. Download data and predict gender
2. Compute sentiment and cluster papers using the abstract
3. Compute ESG expertise score

I wrote a post on Medium about the data pipeline. The post is available here: and the Jupyter Notebook is available at this URL: To see my data science project portfolio, check this link:
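The clustering step in the workflow can be sketched with scikit-learn: vectorize the abstracts with TF-IDF, then group the papers with k-means. The abstracts, the number of clusters, and the library choice here are all assumptions for illustration; the actual pipeline is described in the Medium post and notebook.

```python
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

# Hypothetical abstracts; the real pipeline uses downloaded paper abstracts
abstracts = [
    "Corporate governance and shareholder value in listed firms.",
    "Board structure, governance quality, and firm performance.",
    "Carbon emissions, climate risk, and environmental regulation.",
    "Environmental policy and the cost of pollution abatement.",
]

# Embed the abstracts as TF-IDF vectors
X = TfidfVectorizer(stop_words="english").fit_transform(abstracts)

# Cluster the papers (k=2 is an arbitrary choice for this toy corpus)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)
```

Each paper receives a cluster label, which can then feed into a downstream score such as the ESG expertise measure mentioned above.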