Job Description

**Crawler Engineer**
Job summary:
This role focuses on enhancing and scaling Lex Machina's crawling infrastructure to support our customer-facing products. You will be responsible for designing and maintaining robust automated data collection systems in a challenging, large-scale environment. Your work will play a critical role in driving the success of our products, making this an impactful and rewarding opportunity.
**Responsibilities**:
- Build, maintain, and optimize automated data collection and storage systems, ensuring reliability and scalability.
- Extract and process large datasets from industries such as retail pricing, neobanks, and airlines or online travel agencies (OTAs).
- Debug, troubleshoot, and optimize crawlers for maximum performance and accuracy.
- Collaborate with cross-functional teams to align data collection processes with business goals.
- Create and maintain comprehensive documentation of crawling infrastructure and processes.
- Communicate effectively...

Ready to Apply?

Take the next step in your AI career. Submit your application to NET2SOURCE INC GLOBAL today.
