Case Study: AI/ML Project – MAYA

Features:

Maya reads the RSS feed of the SEC EDGAR portal on a daily basis and fetches data for companies that have recently filed their proxies.
Maya can locate and fetch the Summary Compensation table from the proxy filings of companies listed on the SEC portal.
Maya can segregate data for the different officers in the Summary Compensation table.
Maya can locate the Exercised and Vested table in a DEF 14A filing.
Maya can locate the Outstanding Equity table in DEF 14A filings, even when it spans multiple pages.
Maya can locate the Plan-Based Awards table in DEF 14A filings.
It can identify officer names in a paragraph or a table.
It can intelligently match variations in officer name nomenclature using past years' data from our databases.
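The officer-name matching described above can be sketched with simple fuzzy matching against previously seen names. The function name, normalization rules, and similarity threshold below are illustrative assumptions, not MAYA's actual implementation.

```python
from difflib import SequenceMatcher

def match_officer(name, known_names, threshold=0.85):
    """Return the closest previously seen officer name, or None.

    Normalizes case and punctuation so variants such as
    "Timothy D. Cook" and "Timothy Cook" still match.
    (The threshold and normalization rules are illustrative.)
    """
    def normalize(n):
        return " ".join(n.replace(".", " ").replace(",", " ").lower().split())

    best, best_score = None, 0.0
    for known in known_names:
        score = SequenceMatcher(None, normalize(name), normalize(known)).ratio()
        if score > best_score:
            best, best_score = known, score
    return best if best_score >= threshold else None
```

In practice, matching against past years' officer lists lets a new filing's "Timothy D. Cook" be linked to last year's "Timothy Cook" record instead of creating a duplicate officer.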

Case Study – MAYA

MAYA was developed with the objective of migrating from a manual data-entry process to a system of intelligent agents that can locate target data in company proxy filings, download that data, and load it into our systems for further analytics.

MAYA is developed as an automation engine to fetch different kinds of data from proxy filings on the Securities and Exchange Commission (SEC) portal. It is an agent that combines Python, web scraping, artificial intelligence, SQL, data-processing libraries such as Pandas, and a few other technologies to intelligently identify and extract the target data from a paragraph or a table.
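As a minimal sketch of the RSS step, the snippet below parses an EDGAR-style Atom feed and keeps only DEF 14A entries. The sample feed and tag layout are simplified assumptions for illustration, not the exact EDGAR feed schema.

```python
import xml.etree.ElementTree as ET

# Simplified sample of an EDGAR-style Atom feed (illustrative, not the real schema).
SAMPLE_FEED = """<?xml version="1.0" encoding="UTF-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <entry>
    <title>DEF 14A - ACME CORP (0000123456) (Filer)</title>
    <link href="https://www.sec.gov/Archives/edgar/data/123456/index.htm"/>
  </entry>
  <entry>
    <title>10-K - OTHER CO (0000999999) (Filer)</title>
    <link href="https://www.sec.gov/Archives/edgar/data/999999/index.htm"/>
  </entry>
</feed>"""

ATOM = "{http://www.w3.org/2005/Atom}"

def recent_proxy_filings(feed_xml):
    """Return (title, link) pairs for the DEF 14A entries in an Atom feed."""
    root = ET.fromstring(feed_xml)
    filings = []
    for entry in root.iter(ATOM + "entry"):
        title = entry.findtext(ATOM + "title", default="")
        link = entry.find(ATOM + "link")
        if title.startswith("DEF 14A"):
            filings.append((title, link.get("href") if link is not None else None))
    return filings
```

A daily job would fetch the live feed over HTTP and pass its body to a parser like this, keeping only the proxy filings for further processing.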

The MAYA model is currently trained to extract the Summary Compensation table, Outstanding Equity components, the Exercised and Vested section, and Plan-Based Awards from a company's proxy filing. The agent checks the SEC RSS feed on a daily basis to see whether the filing companies are already present in our database. If a company is found, it then checks whether that company already has data in our databases for the relevant fiscal year; if not, it adds that fiscal year against the respective company.
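The daily bookkeeping step above might look roughly like the sketch below, using an in-memory SQLite database. The table names, schema, and function are assumptions for illustration, not MAYA's actual database layout.

```python
import sqlite3

def ensure_fiscal_year(conn, company, fiscal_year):
    """Add fiscal_year for company only if the company is tracked
    and that year is not already present. Returns True if added."""
    cur = conn.cursor()
    # Skip companies that are not in our database at all.
    cur.execute("SELECT 1 FROM companies WHERE name = ?", (company,))
    if cur.fetchone() is None:
        return False
    # Skip fiscal years that already have data.
    cur.execute(
        "SELECT 1 FROM company_years WHERE company = ? AND fiscal_year = ?",
        (company, fiscal_year),
    )
    if cur.fetchone() is not None:
        return False
    cur.execute(
        "INSERT INTO company_years (company, fiscal_year) VALUES (?, ?)",
        (company, fiscal_year),
    )
    conn.commit()
    return True

# Demo with an in-memory database (illustrative schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE companies (name TEXT)")
conn.execute("CREATE TABLE company_years (company TEXT, fiscal_year INTEGER)")
conn.execute("INSERT INTO companies VALUES ('ACME CORP')")
```

The second call for the same company and year is a no-op, which keeps the daily job idempotent when the same filing appears in the feed more than once.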

MAYA then goes to the respective proxy filing page and locates all the target data. It downloads the data, cleans it, and finally inserts it into our databases. This data can then be viewed through an internal portal (CLIC) for further downstream operations and analytics.
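The cleaning step might, for example, normalize the currency strings scraped from a compensation table into numeric values before insertion. The helpers below are an illustrative sketch of that step, not MAYA's actual pipeline.

```python
import re

def clean_amount(cell):
    """Convert a scraped table cell like '$1,234,567' to an int.
    Dashes and empty cells (common placeholders in SEC tables)
    become None. Illustrative of the cleaning step only."""
    text = cell.strip()
    if text in {"", "-", "–", "—"}:
        return None
    # Strip dollar signs, commas, and any other non-digit characters.
    digits = re.sub(r"[^\d]", "", text)
    return int(digits) if digits else None

def clean_row(row):
    """Clean one scraped row: keep the officer name, convert the rest."""
    return [row[0]] + [clean_amount(c) for c in row[1:]]
```

After rows are cleaned this way, they can be loaded into a database or a Pandas DataFrame for the downstream analytics served through CLIC.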