About
My background is in Data Analytics, where I provide services to clients for all of their back-end data needs. I use tools such as MS Excel, SQL, MS Access, and Python to clean the data and point out any anomalies.
Once the data is cleaned, it is transformed into a visualization with Power BI or Tableau to convey the message in a meaningful way. On a more personal note, I received my Bachelor of Science degree in Mathematics and pursued a minor in Business Foundations. During my final year of university, I also studied Data Analytics at the prestigious Rice University.
I currently work as an Associate at PricewaterhouseCoopers (PwC), where I focus on Advisory Services with the Deals Technology and Data Solutions team. My primary role is to manage client data and use that information to advise on new business solutions that generate value for the client's company.
Arian Sadri, the Analyst.
Basic information about me.
- Birthday: February 20th, 1996
- Website: ariansadri.com
- Phone: 713-820-8622
- City: Houston, Texas
- Age: 26
- Degree: Mathematics and Business Foundations
- Email Address: sadri_arian@yahoo.com
- Freelance: Not Available
Skills
My Data Analytics skills, each with a measurement bar based on my experience and knowledge of the subject.
Resume
An in-depth view of my past experiences and capabilities.
Projects
The Coronavirus Impact
This project shows the impact of the Coronavirus disease in the United States of America. The data was pulled directly from the Centers for Disease Control and Prevention (CDC) database. Then, with a bit of data cleaning and manipulation, I displayed the spread of the virus per state and per county using GeoMapping. Please feel free to hover your cursor over the image and select the "+" to see the map in a larger view.
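As a rough illustration of the GeoMapping step (not the project's exact code), the sketch below builds a state-level choropleth with Plotly from a hypothetical cleaned CSV of CDC case counts; the file name and column names are assumptions.

```python
import pandas as pd
import plotly.express as px

# Hypothetical cleaned extract of CDC case counts; "state" (two-letter abbreviation)
# and "cases" (cumulative count) are assumed column names.
df = pd.read_csv("cdc_cases_by_state.csv")

# State-level choropleth, similar in spirit to the project's GeoMapping view.
fig = px.choropleth(
    df,
    locations="state",
    locationmode="USA-states",
    color="cases",
    scope="usa",
    title="COVID-19 cases by state",
)
fig.show()
```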
America's Most Wanted
America's Most Wanted was a project designed to answer questions and clear up misconceptions people had about crime rates. I used this project to prove or disprove the many misconceptions people had about crime in the United States. Please use the link attached to each image at the bottom to go directly to the project and see all of the hypotheses I was testing, the maps and charts for each hypothesis, and the final conclusions. Note: all of the data came directly from the FBI, the BLS, and the Census Bureau.
Everything Everywhere Machine Learning
The goal of this project was to solve the common problem of deciding on a place to eat, a problem that continues to haunt people all over the world. The user opens the website, scrolls to the bottom, enters values for "Price Range", "Rating", and "Popularity", and clicks "Submit". The machine learning algorithm then uses that input to pick a food category drawn directly from Yelp data. For example, if the user gets "Mexican", they can enter their address above with a given radius, and the map displays their location along with all restaurants in that category within the chosen radius. There is also a "My Location" button, which uses a default radius of two miles. If the user is concerned about transportation, the buttons under the map display live traffic data from a Google API. *Please note: this website is hosted on the cloud platform Heroku, which is essentially free to use. That is why the site takes a few seconds to load.*
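The project's actual model isn't reproduced here, but as a rough sketch of how a category picker like this could work, the example below trains a k-nearest-neighbors classifier on Yelp-style features; the file name, column names, and neighbor count are all assumptions for illustration.

```python
import pandas as pd
from sklearn.neighbors import KNeighborsClassifier

# Hypothetical Yelp-derived training data: one row per restaurant with
# "price_range" (1-4), "rating" (1-5), "popularity" (review count), and "category".
restaurants = pd.read_csv("yelp_restaurants.csv")

X = restaurants[["price_range", "rating", "popularity"]]
y = restaurants["category"]

# Nearest-neighbors model: the three form inputs are matched against restaurants
# with similar attributes, and the most common category among them is returned.
model = KNeighborsClassifier(n_neighbors=15)
model.fit(X, y)

# Example of the "Price Range", "Rating", and "Popularity" form values.
user_input = pd.DataFrame([{"price_range": 2, "rating": 4.0, "popularity": 500}])
print(model.predict(user_input)[0])  # e.g. "Mexican"
```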
Also, please feel free to check out my many other projects on GitHub. There is a direct link to my page using the second profile circle under my main picture.
Services
Data analysis is never a single step; multiple processes must be completed to achieve the end result. When the correct steps are taken with the proper tools, the reach of data is endless: we can explain the past, the present, and the future.
Data Extraction
The first process is to extract the data from a client's file manager or a data warehouse. Once the data is in hand, it can be sent to the cleaning process.
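As a minimal sketch of this step, assuming a SQL warehouse reachable with SQLAlchemy (the connection string, table, and file names below are placeholders), the extraction can be as simple as:

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical warehouse connection; the URL and table name are placeholders.
engine = create_engine("postgresql://user:password@warehouse-host:5432/client_db")

# Pull the raw records into a DataFrame so they can be handed off to the cleaning step.
raw = pd.read_sql("SELECT * FROM sales_transactions", engine)
raw.to_csv("raw_sales.csv", index=False)
```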
Data Clean Up
Sometimes you are given the luxury of receiving a dataset that contains no errors. When you are not, the dataset must always be cleaned before it can move to the next step in the process.
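A minimal cleaning sketch in Python, assuming the hypothetical sales extract from the step above (file and column names are illustrative): duplicates and incomplete rows are dropped, types are standardized, and anomalies are flagged rather than deleted.

```python
import pandas as pd

# Hypothetical raw extract from the extraction step.
raw = pd.read_csv("raw_sales.csv")

clean = (
    raw.drop_duplicates()                      # remove repeated rows
       .dropna(subset=["order_id", "amount"])  # drop rows missing key fields
)

# Standardize types and text so later queries behave predictably.
clean["order_date"] = pd.to_datetime(clean["order_date"], errors="coerce")
clean["amount"] = pd.to_numeric(clean["amount"], errors="coerce")
clean["region"] = clean["region"].str.strip().str.title()

# Point out anomalies rather than silently discarding them.
anomalies = clean[clean["amount"] < 0]
print(f"{len(anomalies)} rows with negative amounts flagged for review")

clean.to_csv("clean_sales.csv", index=False)
```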
Data Manipulation
Once the dataset has been fully cleaned and no errors remain, you can begin searching for the information it holds. You then manipulate the dataset with queries, narrowing it down to answer the questions at hand.
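For example, a pandas sketch of this narrowing step (the question, file, and column names are illustrative) could look like:

```python
import pandas as pd

# Hypothetical cleaned dataset from the previous step.
clean = pd.read_csv("clean_sales.csv", parse_dates=["order_date"])

# Narrow the data down to answer one question, e.g. "What were 2022 sales totals by region?"
answer = (
    clean.query("order_date >= '2022-01-01' and order_date < '2023-01-01'")
         .groupby("region", as_index=False)["amount"]
         .sum()
         .sort_values("amount", ascending=False)
)
answer.to_csv("sales_by_region_2022.csv", index=False)
```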
Data Visualization
Once the data has been queried, it can be displayed in a chart or graph to report the findings. Data means nothing if it cannot be conveyed in a way that answers a question; graphing your clean, narrowed-down data is the best way to speak to your audience.
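In practice this is where Power BI or Tableau come in; as a minimal code sketch of the same idea, assuming the hypothetical regional summary from the previous step, a simple bar chart could be produced like this:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical summary produced by the manipulation step.
answer = pd.read_csv("sales_by_region_2022.csv")

# A simple bar chart that reports the finding to the audience.
fig, ax = plt.subplots()
ax.bar(answer["region"], answer["amount"])
ax.set_title("2022 Sales by Region")
ax.set_xlabel("Region")
ax.set_ylabel("Total sales")
plt.tight_layout()
plt.show()
```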