Location: Barcelona, Spain
Job Category: Internet
EU work permit required: Yes
Job Reference: ewktk1vn
Posted: 14.07.2025
Expiry Date: 28.08.2025
Job Description:
Join Tether and Shape the Future of Digital Finance
At Tether, we’re not just building products; we’re pioneering a global financial revolution. Our solutions enable seamless integration of reserve-backed tokens across blockchains, empowering businesses to store, send, and receive digital tokens securely and instantly worldwide. Transparency ensures trust in every transaction.
Innovate with Tether
Tether Finance: Our product suite includes the trusted stablecoin USDT, used by hundreds of millions, and digital asset tokenization services.
Additional initiatives include:
* Tether Power: Sustainable energy solutions for Bitcoin mining using eco-friendly practices.
* Tether Data: AI and peer-to-peer technology innovations, including KEET, for secure data sharing.
* Tether Education: Digital learning platforms for individuals in the digital and gig economies.
* Tether Evolution: Pushing the boundaries of technological and human potential for innovative futures.
Why Join Us?
Our global remote team is passionate about fintech innovation. If you excel in English and want to contribute to a leading platform, Tether offers a dynamic environment in which to grow and make an impact.
Are you ready to be part of the future?
About the job:
As part of the AI model team, you will develop architectures for models of various scales, enhancing AI capabilities and efficiency. Your expertise in LLM architectures and pre-training optimization will drive groundbreaking advancements.
Responsibilities:
* Pre-train AI models on large distributed servers with NVIDIA GPUs.
* Design and prototype innovative architectures.
* Conduct experiments, analyze results, and refine models.
* Improve model efficiency and performance.
* Advance training systems for scalability and efficiency.
Requirements include a degree in Computer Science or a related field (PhD preferred), proven experience in large-scale LLM training, familiarity with frameworks such as PyTorch and Hugging Face, and expertise in transformer architectures.