
All remote jobs at Personal Capital

Skills: java amazon-web-services spring-boot kubernetes mysql

About Us: Personal Capital, an Empower Company, is a remote-delivery, industry-leading digital wealth management company that helps people transform their financial lives through technology and advisory services. In 2020, Personal Capital joined Empower Retirement, the nation’s second-largest retirement services provider, to continue the mission to disrupt the traditional $30 trillion U.S. wealth management market. Our state-of-the-art tools and technology, used by over 2.5 million customers, provide investors with a complete financial picture. Our registered investment advisors offer expert guidance and customized strategies based on a personal understanding of an investor’s financial picture and goals. Personal Capital currently manages over $14 billion in assets, virtually delivered from offices across the U.S.

The Opportunity: You will work on data that will amaze you, work with a team that will inspire you, and help create products that truly add value to our users. Personal Capital’s server team is extremely agile and cloud-native, and you will have the chance to be an influential member of this team, with an impact not just on Personal Capital but also on the financial lives of our users. We keep our feet on the ground and heads in the cloud.

The Candidate: We are looking for a very curious Senior Java/AWS Engineer. You need to be a thinker and a doer. You need to be smart and build smart products. You need to be ambitious. This is not an easy job: you will need to wear multiple hats, work with many unknowns, and travel many unpaved roads to tackle large-scale problems. And you need to have a cloud-native app mentality. This will be your finest work and creation, and an amazing engineering team is there to collaborate with you and support you.

Please note that although this specific post is for a senior engineer, our team has openings across various levels: mid-level software engineer, senior engineer, principal engineer, and engineering manager.

The Work: As a server-side engineer you will work on many initiatives, integrations, and innovations, while always working to scale the system further, improve performance, and keep it secure.

  • Design, implement, deploy, and support new web-based applications and web services using server technology stacks that include Java, MySQL, and AWS services.
  • Define and implement REST web services that expose personal financial services for managing financial accounts, payments, investment portfolios, and credit products.
  • Execute end-to-end software development and deployment in an agile environment that greatly values CI/CD.
  • Define and develop cloud-native distributed systems and container orchestration platforms using microservice and serverless architectures.

Desired Skills and Experience:

  • 4+ years of professional software engineering experience
  • Expertise with standard server-side software technologies, including: Java, MySQL, Spring, Hibernate, etc.
  • Experience with REST and API design
  • Experience with AWS Cloud Computing infrastructure and APIs is a big plus.
  • Solid practitioner’s understanding of secure application development best practices, such as OWASP
  • Deployment of scalable web systems with architectural requirements for high availability, high throughput, low response time, etc.
  • Experience in Object Oriented Analysis and Design (OOA/OOD) with appropriate use of design patterns.
  • Desire to grow professionally with a small, focused development team in a fast-paced environment.
  • Excellent communication and problem-solving skills.
  • Bachelor's degree or higher in Computer Science or a related field

Personal Capital is proud to be an equal opportunity workplace dedicated to pursuing and hiring a diverse workforce.



Skills: python sql pyspark amazon-web-services etl

The Job:

Senior Data Engineers play a critical role on the Data and Analytics team and are responsible for transforming data from disparate systems into insights and analytics for business stakeholders. The Data and Analytics team leverages cloud-based infrastructure to implement technology solutions that are scalable, resilient, and efficient. You will collaborate with other Data Engineers, Data Analysts, Data Scientists, DBAs, cross-functional teams, and business partners.

The position offers the opportunity to help architect, design, implement, and operate data engineering solutions, using Agile methodology, that empower users to make informed business decisions. The ideal candidate is self-motivated, self-directed, and has hands-on experience with all aspects of the software development lifecycle, from design to deployment.

The Candidate:

Our team is searching for someone with a deep understanding of the full data lifecycle and of the role that high-quality data plays across applications, machine learning, business analytics, and reporting. Strong candidates will exhibit solid critical thinking skills, the ability to synthesize complex problems, and a talent for transforming data into solutions that add value across a wide range of business requirements.

Ideal candidates will have a demonstrated ability to facilitate and take ownership of assigned technical projects in a fast-paced environment. Excellent written and verbal communication skills are required, as we work in a collaborative, cross-functional environment and interact with the full spectrum of business divisions.

Qualifications:

  • Bachelor of Science degree in Computer Science or equivalent.
  • At least 7 years of post-degree professional experience, including:
  • 4+ years of development experience building and maintaining ETL pipelines (a minimal PySpark sketch appears after this list)
  • 3+ years of Python development experience.
  • Experience with AWS integrations such as Kinesis, Firehose, Aurora Unload, Redshift, Spectrum, Elastic MapReduce, SageMaker, and Lambda.
  • Experience in mentoring junior team members through code reviews and recommending adherence to best practices.
  • Deep understanding of writing test cases to ensure data quality, reliability, and a high level of confidence in the data.
  • Track record of advancing new technologies to improve data quality and reliability.
  • Demonstrated ability to continuously improve the quality, efficiency, and scalability of data pipelines.
  • Expert SQL skills, including performance tuning and the use of indexes and materialized views to improve query performance.
  • Advanced knowledge of both OLTP and OLAP environments, with successful implementation of efficient design concepts. Proficiency with the design and use of NoSQL databases to optimize big-data storage and retrieval.
  • Experience with API integrations with external vendors to push and pull data between organizations
  • Familiarity with data orchestration pipelines using Argo or Airflow (a bare-bones Airflow DAG sketch also follows this list). Knowledge of analytic tools such as R, Tableau, Plotly, and Python Pandas.
  • Financial services industry experience is a plus
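
Because the ETL and AWS items above are broad, here is a minimal PySpark sketch of the kind of pipeline this role typically involves: read raw events from S3, apply a light transformation, and write partitioned Parquet for downstream analytics. The bucket names, paths, and column names are hypothetical placeholders for illustration, not details taken from this posting.

```python
# Minimal PySpark ETL sketch. Bucket names, paths, and columns are
# hypothetical placeholders; any real pipeline will differ.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("daily-transactions-etl")  # hypothetical job name
    .getOrCreate()
)

# Extract: read raw JSON events landed in S3 by an upstream process.
raw = spark.read.json("s3://example-raw-bucket/transactions/2024-01-01/")

# Transform: basic cleanup and typing.
clean = (
    raw
    .filter(F.col("amount").isNotNull())
    .withColumn("amount", F.col("amount").cast("decimal(12,2)"))
    .withColumn("event_date", F.to_date("event_ts"))
)

# Load: write partitioned Parquet for downstream analytics
# (for example, queried via Redshift Spectrum).
(
    clean.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3://example-curated-bucket/transactions/")
)

spark.stop()
```

In practice a job like this would likely run on a managed service such as EMR and take its input date as a parameter rather than a hard-coded path.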

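For the orchestration item, the sketch below shows a bare-bones Airflow DAG that schedules one daily ETL task. The DAG id, schedule, and the run_daily_etl callable are assumptions made purely for illustration.

```python
# Bare-bones Airflow DAG sketch; IDs, schedule, and the callable are
# hypothetical and only illustrate the daily orchestration pattern.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def run_daily_etl(**context):
    # Placeholder for the actual extract/transform/load logic,
    # e.g. submitting the PySpark job sketched above.
    print("running ETL for", context["ds"])


with DAG(
    dag_id="daily_transactions_etl",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    etl_task = PythonOperator(
        task_id="run_daily_etl",
        python_callable=run_daily_etl,
    )
```

In a real deployment the PythonOperator would more likely be swapped for an operator that submits the Spark job to EMR, or an Argo workflow would play the same scheduling role.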

Time zone: (GMT-08:00) Pacific Time