A fintech innovation leader is seeking an AI model developer to enhance capabilities in model training. This role involves pre-training on large GPU servers, designing architectures, and refining methods. Candidates should have a degree in Computer Science or a related field, preferably a PhD, with strong experience in LLM training and knowledge of transformer models. Join a dynamic remote team passionate about fintech innovation.
Join Tether and Shape the Future of Digital Finance
At Tether, we’re not just building products, we’re pioneering a global financial revolution. Our solutions empower businesses—from exchanges and wallets to payment processors and ATMs—to seamlessly integrate reserve-backed tokens across blockchains. Using blockchain technology, Tether enables instant, secure, and global digital token transactions at low cost. Transparency ensures trust in every transaction.
Innovate with Tether
Tether Finance: Offers the trusted stablecoin USDT and digital asset tokenization services.
And more:
Tether Power: Optimizes excess power for eco-friendly Bitcoin mining in diverse facilities.
Tether Data: Advances AI and P2P tech with solutions like KEET, our private data sharing app.
Tether Education: Provides digital learning access for individuals in the digital and gig economies.
Tether Evolution: Merges technology and human potential to push innovation boundaries.
Why Join Us?
Work remotely with a global team passionate about fintech innovation. Collaborate with top talent, push boundaries, and set industry standards. If you excel in English communication and want to contribute to a leading platform, Tether is your place.
Are you ready to be part of the future?
About the job:
As part of the AI model team, you will develop architectures for models of various scales, enhancing AI capabilities and efficiency through research-driven techniques, data curation, and resolving pre-training bottlenecks.
Requirements:
A degree in Computer Science or a related field is required, preferably a PhD in NLP, ML, or a similar discipline with a strong research record. Hands-on experience with large-scale LLM training and distributed training frameworks, along with deep knowledge of transformer models and the PyTorch/Hugging Face libraries, is essential.