Feb 19, 2019
Data Architect (Beam Solutions)
Job description:
Beam is disrupting a $7.5 billion compliance software market. Beam's scalable technologies incorporate new data sources for our bank and fintech partners, helping make the financial system safer while maximizing the value and utility of critical compliance resources. As an engineer at Beam, you will build large-scale, comprehensive, elegant, and easy-to-operate compliance solutions for fintechs, large banks, small banks, credit unions, and broker-dealers.
Beam is a rapidly growing venture-backed startup led by industry veterans. With a strong product vision and an experienced development team, our product has been well received by customers and we're excited to grow. As a member of the engineering team at Beam, you will be a critical asset responsible for many tasks and systems that span our entire product and internal infrastructure. As part of the R&D team, you will help shape the technological landscape, build new products, add and improve features, and scale and maintain the systems that keep our customers happy. You will also build internal tools and systems to help streamline and optimize internal initiatives.

We are looking for seasoned Software Engineers with experience building large-scale, distributed, and service-oriented architectures. You feel comfortable switching hats between software development, business requirements gathering, devops tasks, and working with our machine learning experts. You take pride in owning and maintaining the technical work environment. You enjoy learning new technologies and sharing your code with others. Lastly, you have been looking for an opportunity to use your passion and skills for a greater good: fighting financial crime.
Our main office is located in the financial district of San Francisco, and we offer a generous, comprehensive benefits package.
Responsibilities:
- Collaborate with the R&D team to architect and build a distributed cloud-based SaaS solution for transaction monitoring.
- Build highly scalable and distributed data pipelines that can process our large customer data sets in real-time.
- Champion Beam's Data Lake and Beam Intelligence initiatives.
- Continuously identify efficiency opportunities to scale our product and the team behind it.
- Actively participate in product discussions to help build cutting-edge solutions.
- Explore, evaluate and integrate new data pipelines.
Skills And Experience:
- 4+ years of experience, or 2+ years plus a relevant Master's degree.
- BS in Computer Science or a related field.
- Proficient in Python or Java and related machine learning and ETL frameworks.
- Strong understanding of building data pipelines and data lakes.
- Expert in large-scale data processing using the Hadoop stack.
- Deeply familiar with messaging systems and data pipelines such as RabbitMQ and Kafka.
- Solid understanding of NoSQL database modeling and design.
- Experience scaling Cassandra, Elasticsearch, MongoDB, Neo4j, and HDFS.
- Familiar with cloud infrastructures such as AWS.
- Hands-on experience working with social media data sources.
- Thrive in a fast-paced environment.
- Business-driven and results-oriented.
- Obsessive about delivering quality software.
- Passionate about open source.
- Relentless in keeping solutions secure.
- Effective communication & collaboration skills.
Extra credit:
- Experience with containerization and related management stacks such as Docker, Swarm, and Kubernetes.
- Experience working in a financial services context related to risk management.
- Experience with blockchain technology and cryptocurrency.
- Open source contributor.
How to Apply
Interested students can email Kevin Feng, Head of Data Science at Beam Solutions, at kevin@beamsolutions.com.