Work

Experience

Introduction

In this section, I discuss my work experience. Wherever I found myself, I looked for ways to make the most of the opportunity and to contribute in ways that were genuinely appreciated, and in the process I gained the experience I needed to thrive in a professional setting. I will cover which of my contributions had the biggest impact, what skills I developed, and how I was shaped by the tasks I was given and the people I worked with. I will also provide contact information for the people who agreed to be my references.

Below are quick summaries of each of my internships. Please click the buttons on the right to see more details.

Content Data Specialist Intern @ Tesla Government

My primary tasks were cleaning and reformatting data with the Python pandas library for upload to the company site, and scraping data from websites into structured data tables. I also competed in the company hack-a-thon with a natural language model for assigning topic tags, and entered a data visualization competition with a bubble plot made in RStudio.

Intern @ MacDowell Law Group

My main contribution to MacDowell Law Group was using my knowledge of Python and natural language processing to clean and reformat disorganized datasets. I also used Python to compare multiple datasets and highlight rows that differed. In addition, I had the opportunity to act as a consultant for Mr. MacDowell concerning a third-party software vendor and their AI chatbot designed for websites.

Resume download

You can download my resume here: My Resume

About

Start Date: June 1st, 2023
End Date: August 25th, 2023

Tesla Government is a contractor company for federal government agencies headquartered in Falls Church, VA. I was hired for my skills in Python and my Korean fluency. This was my very first internship and my first step into writing code in a professional workplace environment, and it couldn't have been a better first step. I was surrounded by people who made it clear that they appreciated my work and had a kind and patient supervisor who explained my tasks concisely. The company even provided opportunities to learn and grow via friendly competition. For many of my tasks, I was working alongside a recent graduate from Emory University. He was relaxed and easygoing but also very talented. We worked very well together and I learned a lot from him in the process. Overall, this was a wonderful internship and I was lucky to be able to develop my professional skills in such an encouraging environment.

What I did

Information Confidentiality Training

During my onboarding, I received training on information confidentiality and how it pertained to company data. The training covered proper handling of personally identifiable information (PII) and detailed steps for securing a work environment so that no data is left exposed.

Maritime Data Filtering and Cleaning with Python

This was one of my major ongoing tasks throughout the internship. Data on various maritime activities needed to be cleaned and reformatted to be uploaded to the company's website and be displayed on maps. My partner and I were regularly given datasets over the course of the internship to clean. This involved heavy usage of the Python pandas library and string manipulation with the occasional regular expression. Each dataset took us 2-3 days except for one particularly long and jumbled one that took us a full work week. The recipients made it clear that they were amazed by how fast we made progress.
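The kind of cleaning described above can be sketched with pandas. This is a toy illustration under my own assumptions: the column names, vessel names, and coordinate format below are invented for the example, not taken from the actual company data.

```python
import pandas as pd

# Hypothetical raw rows standing in for a messy maritime dataset.
raw = pd.DataFrame({
    "vessel": ["  MV Aurora ", "ss neptune", "MV Aurora"],
    "position": ["38.88N, 77.17W", "38.88 N ,77.17 W", "38.88N, 77.17W"],
})

# Normalize whitespace and casing so duplicate rows can be detected.
raw["vessel"] = raw["vessel"].str.strip().str.title()

# Split a free-text coordinate string into numeric lat/lon columns
# using a regular expression with named groups.
coord = raw["position"].str.extract(r"(?P<lat>[\d.]+)\s*N\s*,\s*(?P<lon>[\d.]+)\s*W")
raw[["lat", "lon"]] = coord.astype(float)

clean = raw.drop(columns=["position"]).drop_duplicates()
print(clean)
```

The real datasets needed far more of this kind of string manipulation, but the pattern was the same: normalize, extract, deduplicate.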

Company Hack-a-thon

Participating in the company's hack-a-thon is probably my favorite memory of this internship; in fact, I consider it the magnum opus of my time there. I teamed up with my Maritime Data partner, another intern from my onboarding group, and a full-time employee to compete. Our submission was a natural language model that could read foreign relations articles and assign appropriate topic tags indicating the countries involved, the type of interaction (military, humanitarian, etc.), and whether it was more friendly or hostile. We earned 2nd place, and a different team at the company even expressed interest in using our model in their future work.
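To give a sense of the output format, here is a deliberately simplified keyword-based sketch of the same tagging task. Our actual submission used a trained natural language model; this toy version, with made-up keyword lists, only illustrates the three kinds of tags it produced.

```python
# Illustrative keyword lists -- not the real model's vocabulary.
COUNTRIES = {"France", "Japan", "Kenya"}
TYPE_KEYWORDS = {"troops": "military", "aid": "humanitarian", "trade": "economic"}
HOSTILE_WORDS = {"sanctions", "condemned", "clash"}

def tag_article(text: str) -> dict:
    """Assign country, interaction-type, and tone tags to an article."""
    countries = sorted(c for c in COUNTRIES if c in text)
    types = sorted({v for k, v in TYPE_KEYWORDS.items() if k in text})
    words = {w.strip(".,").lower() for w in text.split()}
    tone = "hostile" if words & HOSTILE_WORDS else "friendly"
    return {"countries": countries, "type": types, "tone": tone}

print(tag_article("France sent aid to Kenya after the floods."))
```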

Data Visualization Contest

In addition to the hack-a-thon, I was also given the opportunity to participate in a data visualization contest to present African foreign relations data. I partnered with my Maritime Data friend and used RStudio for our visualization. Our visualization featured several bubbles on a map of Africa with color and size representing the country involved and the age/recency of the interaction. We didn't come first, but we did get a very close second.

Web Scraping

For this task, I was able to use both my language skills and my Python expertise. I was asked to find data on various North Korean missile launches and then scrape the approved sources. And what better place to find information on North Korean missile launches than their southern neighbor, who has to deal with all of it? Using a VPN, I was able to draw not just on English sources but also on sources directly from South Korean news outlets. From the scraped data, I extracted each missile's launch date, distance covered, landing location, and model name and entered them into datasets.
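The parsing half of that workflow can be sketched with Python's standard-library HTML parser. The snippet below stands in for a fetched news page (the real task fetched the approved sources first); the table layout and field names are my own assumptions for the example.

```python
from html.parser import HTMLParser

# Hypothetical HTML fragment standing in for a scraped news page.
PAGE = """
<table>
  <tr><td>2023-05-31</td><td>Chollima-1</td><td>Yellow Sea</td></tr>
  <tr><td>2023-07-12</td><td>Hwasong-18</td><td>Sea of Japan</td></tr>
</table>
"""

class LaunchTableParser(HTMLParser):
    """Collects the text of each <td> cell, grouped by <tr> row."""
    def __init__(self):
        super().__init__()
        self.rows, self._row, self._in_td = [], [], False
    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag == "td":
            self._in_td = True
    def handle_endtag(self, tag):
        if tag == "td":
            self._in_td = False
        elif tag == "tr" and self._row:
            self.rows.append(self._row)
    def handle_data(self, data):
        if self._in_td:
            self._row.append(data.strip())

parser = LaunchTableParser()
parser.feed(PAGE)
launches = [dict(zip(["date", "model", "landing"], row)) for row in parser.rows]
print(launches)
```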

Data Entry

When I wasn't working on a task that involved coding, I was working on simple data entry on the company's information-sharing site. It certainly wasn't the most fun part of work; in fact, I dreaded it. But I like to think that even this was a learning experience that molded me into a more diligent and tenacious individual.

What I learned

References

About

Start Date: August 5th
End Date: September 10th

This was my second internship, and it was much shorter than my first. Admittedly, it was less of an internship and more of helping out a family friend. Long story short: my family visited an old family friend for the first time in almost a decade, the friend mentioned she needed help at her office, and I became an intern that day.

For the next two weeks, I worked at this family friend's organization: MacDowell Law Group, PC, which also happened to be my own family's go-to firm for personal injury cases. Mr. Richard MacDowell is a renowned personal injury attorney in Fairfax, VA, and as such has accumulated data on thousands of clients over the years, my father included. However, the firm had made the mistake of putting different interns to work on the same datasets, and the data became very disorganized as each intern entered it their own way. When MacDowell Law Group decided to adopt Clio, third-party software designed for law firms, and Clio requested their data in a specific format, they realized just how disorganized it was.

For months, many of the employees had been staying at the office overnight several times a week trying to reorganize multiple jumbled datasets, and it looked like the work would take years to finish. That was until I was brought in and cleaned and formatted an entire dataset in a day with Python. As soon as everyone realized the magic that is Python, I was called in almost every day for two weeks to clean the rest of their data until I had to go back to campus. I think I received enough praise in those two weeks to feed my confidence for two lifetimes.

What I did

Data Cleaning with Python

This was my primary task and main contribution to the law firm. Datasets that had grown increasingly disorganized over the years, with multiple interns working on them, needed to be reorganized. My knowledge of regular expressions and natural language processing was immensely helpful here. I read each entry in each row and extracted the important fields, like the fee type, the amount, and the client's company name. A few edge cases slipped past my regular expressions, but there were never more than ten of them, and they could easily be handled by explicitly setting their values.
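A small sketch of that extraction approach, with invented example entries (the real rows were free-text notes typed by different interns, each with their own formatting habits):

```python
import re

# Hypothetical free-text entries -- not actual client data.
entries = [
    "Contingency fee - $12,500.00 (Acme Trucking LLC)",
    "flat FEE $750 - Smith & Sons Moving",
]

PATTERN = re.compile(
    r"(?P<fee_type>contingency|flat|hourly)\s*fee"   # fee type, any casing
    r"\D*(?P<amount>[\d,]+(?:\.\d{2})?)"             # dollar amount
    r"\s*[-(]?\s*(?P<company>[^)]+?)\)?$",           # company name
    re.IGNORECASE,
)

for entry in entries:
    m = PATTERN.search(entry)
    if m:
        print(m.group("fee_type").lower(), m.group("amount"), m.group("company"))
    else:
        # The handful of edge cases the pattern missed were set by hand.
        print("MANUAL REVIEW:", entry)
```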

Spreadsheet Comparison with Python

Some time before my arrival, the firm experienced a power outage that caused one of their datasets to lose unsaved changes. I was asked to compare two datasets, one from before the outage and one from after, and highlight any differences. I used the Python pandas package to read the Excel files and then compared them using the case number as the primary key.
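The comparison step looked roughly like the following. This is a minimal sketch with made-up case data; the real task read two Excel files with pd.read_excel instead of building the frames inline.

```python
import pandas as pd

# Hypothetical "before outage" and "after outage" snapshots.
before = pd.DataFrame({
    "case_no": [101, 102, 103],
    "status":  ["open", "settled", "open"],
})
after = pd.DataFrame({
    "case_no": [101, 102, 103],
    "status":  ["open", "open", "closed"],
})

# Join the two snapshots on the case number (the primary key),
# then keep only the rows whose status differs between them.
merged = before.merge(after, on="case_no", suffixes=("_before", "_after"))
changed = merged[merged["status_before"] != merged["status_after"]]
print(changed)
```

The rows in `changed` were the ones flagged for the staff to review and re-enter.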

Consulting on the Potential Onboarding of a Third-Party Vendor

At one point in the internship, a representative of a third-party solutions provider came in to pitch their product for the firm's website: an AI chatbot meant to increase visitor interaction and retention. As the new "tech guy", I was called in by Mr. MacDowell to listen to the representative and help him come to a decision. I first asked whether AI was truly necessary for the company website. It would help redirect clients to specific resources based on their requests, but a much simpler program could do that too, and I cautioned Mr. MacDowell that "AI" has lately become a buzzword that suggests quality even where it is not needed. I then questioned the legitimacy of the chatbot, since I had prior experience with a few self-proclaimed "AI" chatbots that required users to pick from a set list of options instead of typing freely; this one, however, was indeed real AI. Throughout, I made it clear when I didn't know something rather than try to make something up. In the end, no decision was made, but it was still an interesting experience.

What I learned

References