Hybrid

LLM Lead Engineer

Location: USA / Job Type: Full Time

Please email jobs@purgo.ai

About the Role


We are seeking a Lead Engineer with expertise in Large Language Models (LLMs) to drive our generative AI application development. The role requires the use of advanced AI technologies, such as LLM Serving Frameworks and Retrieval-Augmented Generation (RAG), to innovate and improve AI applications for various purposes. The ideal candidate should have practical experience in fine-tuning AI models, utilizing the OpenAI API, and employing open-source LLMs to craft cutting-edge applications. Proficiency with LLM development frameworks, particularly Langchain, is essential. The primary goal is to develop and enhance our generative AI capabilities to tackle complex software engineering challenges.


Requirements


Responsibilities


  • Design, implement, and maintain LLM serving platforms to support the deployment and scaling of large language models.
  • Work with Retrieval-Augmented Generation (RAG) to improve the information retrieval capabilities of our models, enhancing their accuracy and efficiency.
  • Fine-tune LLMs on diverse datasets to tailor models for specific applications and ensure optimal performance.
  • Utilize APIs from popular development platforms such as Jira, Github, Snowflake, Databricks, and Datadog to build Gen AI tools for software development.
  • Utilize the OpenAI API and open-source LLMs to build Gen AI apps.
  • Leverage Langchain or similar LLM application development frameworks for efficient AI model development and deployment.

About you


  • Your expertise in Gen AI and LLM applications is both profound and practical.
  • You have a rich background working with cloud data platforms (e.g., Snowflake, Databricks, AWS Redshift, Google BigQuery, Azure), DevOps tools (e.g., Atlassian), cloud SaaS solutions (e.g., Salesforce), and contemporary testing tools/frameworks.
  • You are well-versed in the application interfaces of cloud data platforms and observability tools (e.g., Datadog, Splunk).
  • You have hands-on experience in LLM prompt engineering, including techniques such as Retrieval-Augmented Generation, Self-Consistency, and Prompt Chaining.
  • You are an expert in using APIs from OpenAI, Atlassian, Github, and other prominent development platforms.


About the Company


Purgo AI makes the delivery of cloud-native and data applications fault-free through generative-AI-powered requirements engineering, functional testing, query testing and data integrity. The product generates comprehensive behavior-driven (BDD) requirements from its fine-tuned large language models (LLMs) based on sparse feature request tickets, and compiles detailed function definitions, performance requirements, metrics and security/compliance requirements. It also reasons to identify gaps in requirements and generates detailed questionnaires for product teams to fill those gaps. These generated requirements are also rendered in an LLM-friendly format to trigger AI-driven code generation. The product proactively generates functional tests for the application’s interfaces, the queries in the API payload and the data received back in response. This is done ahead of code development to drive definition and quality assurance across both AI-driven code generation and human-led development. The tests are applied to the developed software, and bug reports are generated for the failures identified. After the software is pushed to production, Purgo AI causally connects the log data from the application all the way back to the requirements and automates the triage, analysis and response to escalation tickets.

Purgo AI is engaging with early customers and design partners on generative-AI-driven requirements and test automation for fault-sensitive cloud-native data apps, across use cases such as business intelligence, forecasting and machine learning over data lakes like Snowflake and Databricks. The product integrates seamlessly with Jira, Github/code co-pilots, testing platforms (e.g., Selenium, Pytest) and data warehouse APIs. Customers begin using Purgo AI for capturing behavior-driven requirements, functional testing, data integrity and data app observability. The company’s co-founder and CTO, Sang Kim, has been an engineering leader at VMWare and Blackberry. The company is based in Palo Alto, CA and was co-created by The Hive, a venture studio focused on data and AI in the enterprise.

To apply, please email jobs@purgo.ai.

Purgo AI is an affirmative action employer and welcomes candidates who will contribute to the diversity of the company.