Join Tether as part of the AI model team, where you will innovate and refine fine-tuning methodologies for advanced models. Your role will involve curating data, optimizing performance, and deploying models in a dynamic environment. The ideal candidate has expertise in large language models and a strong background in computer science or a related field, and is eager to contribute to a groundbreaking platform in digital finance.
Join Tether and Shape the Future of Digital Finance
At Tether, we’re not just building products; we’re pioneering a global financial revolution. Our solutions enable seamless integration of reserve-backed tokens across blockchains, empowering businesses worldwide with secure, instant, and cost-effective digital transactions. Transparency and trust are fundamental to our mission.
Innovate with Tether
Tether Finance: Our product suite includes the trusted stablecoin USDT and digital asset tokenization services.
Tether Power: We promote sustainable Bitcoin mining using eco-friendly practices in geo-diverse facilities.
Tether Data: We advance AI and peer-to-peer technology with solutions like KEET, our secure data-sharing app.
Tether Education: We democratize digital learning to empower individuals in the digital economy.
Tether Evolution: We innovate at the intersection of technology and human potential to shape the future.
Why Join Us?
Our diverse, remote team is passionate about fintech innovation. Join us to work with top talent, set new industry standards, and contribute to a groundbreaking platform. Excellent English communication skills are essential.
Are you ready to be part of the future?
About the job:
As part of the AI model team, you will develop and refine supervised fine-tuning methodologies for advanced models, enhancing their intelligence, performance, and domain-specific capabilities across various systems, from lightweight models to complex multi-modal architectures.
We seek expertise in large language model architectures and fine-tuning optimization, along with a hands-on, research-driven approach. Your responsibilities include curating data, improving baseline performance, troubleshooting bottlenecks, and deploying models into production.
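For a concrete sense of the kind of work involved, below is a minimal sketch of one supervised fine-tuning step for a causal language model. The model name, example data, and hyperparameters are illustrative assumptions chosen for the sketch only; they do not describe Tether's actual stack or methodology.

```python
# Minimal supervised fine-tuning (SFT) sketch for a causal language model.
# Assumptions: the Hugging Face transformers library, a small "gpt2" baseline,
# and two toy instruction/response pairs standing in for a curated corpus.
import torch
from torch.utils.data import DataLoader
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # hypothetical lightweight baseline for the sketch
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # gpt2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained(model_name)

# Toy curated instruction/response pairs (placeholders, not real training data).
examples = [
    "### Instruction: Define a stablecoin.\n### Response: A token pegged to a reserve asset.",
    "### Instruction: What is tokenization?\n### Response: Representing an asset as a digital token.",
]

def collate(batch):
    # Tokenize a batch of strings and use the inputs as labels (next-token loss).
    enc = tokenizer(batch, padding=True, truncation=True, max_length=256, return_tensors="pt")
    enc["labels"] = enc["input_ids"].clone()
    enc["labels"][enc["attention_mask"] == 0] = -100  # ignore padding in the loss
    return enc

loader = DataLoader(examples, batch_size=2, collate_fn=collate)
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)

model.train()
for batch in loader:
    loss = model(**batch).loss  # cross-entropy over next-token predictions
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    print(f"loss: {loss.item():.3f}")
```

In practice this loop would be extended with proper data curation, evaluation against baselines, and profiling of training and inference bottlenecks before deployment, which is the scope of the role described above.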
Responsibilities:
Minimum requirements: