Job Search and Career Advice Platform


Data Scientist

JR Italy

Firenze

On-site

EUR 40,000 - 80,000

Full-time

30+ days ago


Job Description

An innovative company is seeking a Data Scientist to join their AI Data Team. In this role, you will leverage machine learning and data analysis to tackle complex business challenges. Collaborating with engineers and developers, you'll design scalable AI solutions and work with cutting-edge technologies. This position offers the chance to engage with large datasets and integrate advanced models into applications, all while being part of a dynamic and rapidly expanding team. If you're passionate about data science and eager to make an impact, this opportunity is perfect for you.

Skills

  • 2+ years of experience in a data science role with strong ML background.
  • Proficient in Python and SQL for data manipulation and analysis.

Responsibilities

  • Design and develop predictive models with the data team.
  • Process large datasets and automate ML workflows.

Knowledge

Machine Learning
Python
SQL
Statistical Testing
Data Analysis
Problem-Solving

Education

STEM Degree (Engineering, Physics, Mathematics, Statistics)

Tools

Pandas
NumPy
Scikit-learn
TensorFlow
Keras
PyTorch
Jupyter
Google Colab
AWS
Git


The Data Appeal Company is a high-tech company that transforms geo-spatial, sentiment, and market data into compelling, valuable insights that are simple and accessible.

We are a dynamic and rapidly expanding company, part of the Almawave Group.

We are looking for a Data Scientist to join our AI Data Team, where you will develop end-to-end solutions to complex business problems using machine learning, statistics, and data analysis.

As part of our team, you will collaborate closely with system engineers, data scientists, front-end developers, and software engineers to design and implement scalable, high-performance AI-driven solutions.

Your Responsibilities:
  1. Design, develop, and evaluate innovative predictive models in collaboration with the data team.
  2. Work with LLM APIs to integrate large language models into data pipelines and applications.
  3. Benchmark and compare different LLMs in terms of accuracy, latency, and cost-performance trade-offs.
  4. Process and analyze large datasets using state-of-the-art machine learning technologies.
  5. Automate data processing workflows and ML model pipelines.
  6. Write reusable, efficient, and scalable code to improve AI-driven processes.
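To illustrate responsibility 3, a benchmarking harness for comparing models on accuracy, latency, and cost might be sketched as below. The model, dataset, and pricing here are invented stand-ins; a real version would call a provider's SDK and use its token counts rather than a word-count proxy.

```python
import time
from dataclasses import dataclass
from typing import Callable

@dataclass
class BenchmarkResult:
    model: str
    accuracy: float
    avg_latency_s: float
    cost_usd: float

def benchmark(name: str, model_fn: Callable[[str], str],
              dataset: list[tuple[str, str]],
              cost_per_1k_tokens: float) -> BenchmarkResult:
    """Run model_fn over (prompt, expected) pairs, tracking accuracy, latency, and cost."""
    correct, total_latency, total_tokens = 0, 0.0, 0
    for prompt, expected in dataset:
        start = time.perf_counter()
        answer = model_fn(prompt)
        total_latency += time.perf_counter() - start
        # Crude token proxy; a real benchmark would use the API's reported usage.
        total_tokens += len(prompt.split()) + len(answer.split())
        correct += int(answer.strip().lower() == expected.lower())
    return BenchmarkResult(
        model=name,
        accuracy=correct / len(dataset),
        avg_latency_s=total_latency / len(dataset),
        cost_usd=total_tokens / 1000 * cost_per_1k_tokens,
    )

# Stand-in for a real LLM client, so the sketch runs offline.
def toy_model(prompt: str) -> str:
    return "positive" if "great" in prompt else "negative"

dataset = [("This product is great", "positive"), ("Awful service", "negative")]
result = benchmark("toy-model", toy_model, dataset, cost_per_1k_tokens=0.002)
print(result.accuracy, result.cost_usd)
```

Running the same harness over several model callables gives directly comparable accuracy/latency/cost triples, which is the trade-off analysis the responsibility describes.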
What We’re Looking For:
  • 2+ years of experience in a similar role.
  • STEM degree (Engineering, Physics, Mathematics, Statistics).
  • Strong knowledge of statistical testing & inference for KPI extraction from Big Data.
  • Solid expertise in SQL and Python.
  • Familiarity with data management and numerical computing libraries (Pandas, Dask, NumPy, SciPy, Spark, etc.).
  • Experience with ML engines (Jupyter, Google Colab).
  • Knowledge of Machine Learning frameworks (Scikit-learn, NLTK, spaCy).
  • Hands-on experience in data preparation, feature extraction, and engineering.
  • Proficiency in working with LLM APIs (OpenAI GPT, Claude, Mistral, Gemini, Cohere, etc.).
  • Understanding of RESTful APIs, including authentication, rate limits, and response parsing.
  • Experience with model selection and training, particularly transfer learning.
  • A curious, problem-solving mindset and a passion for data science.
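The "statistical testing & inference for KPI extraction" requirement could, for example, mean comparing a KPI between two groups. A minimal sketch, using a two-sided permutation test on invented daily sentiment scores for two hypothetical locations:

```python
import random
from statistics import mean

def permutation_test(group_a: list[float], group_b: list[float],
                     n_permutations: int = 10_000, seed: int = 42) -> float:
    """Two-sided permutation test for a difference in means between two KPI samples."""
    rng = random.Random(seed)
    observed = abs(mean(group_a) - mean(group_b))
    pooled = list(group_a) + list(group_b)
    n_a = len(group_a)
    extreme = 0
    for _ in range(n_permutations):
        rng.shuffle(pooled)
        # Count label shuffles producing a difference at least as large as observed.
        if abs(mean(pooled[:n_a]) - mean(pooled[n_a:])) >= observed:
            extreme += 1
    return extreme / n_permutations

# Hypothetical KPI samples: daily sentiment scores for two store locations.
downtown = [0.62, 0.58, 0.71, 0.66, 0.60, 0.64, 0.69]
suburb = [0.41, 0.45, 0.39, 0.48, 0.44, 0.42, 0.47]
p_value = permutation_test(downtown, suburb)
print(f"p = {p_value:.4f}")
```

A permutation test makes no distributional assumptions, which suits messy real-world KPI data; in practice a library routine (e.g. from scipy) would replace the hand-rolled loop.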
Nice-to-Have Skills:
  • Big Data experience.
  • Knowledge of causal inference for spatial and time-series problems.
  • Experience with geospatial analysis for location intelligence.
  • Familiarity with transformer architectures like BERT (Hugging Face).
  • Understanding of LLM deployment, latency optimization, and cost-efficient scaling strategies.
  • Hands-on experience with deep learning frameworks (TensorFlow, Keras, PyTorch).
  • Proficiency in Git and version control best practices.
  • Experience with Agile methodologies.
  • AWS Cloud experience.
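Several items above and in the requirements (LLM deployment, rate limits, cost-efficient scaling) come down to calling remote APIs robustly. A minimal retry-with-exponential-backoff wrapper, with every name here invented for illustration, might look like:

```python
import time

class RateLimitError(Exception):
    """Raised by a hypothetical API client when the provider returns HTTP 429."""

def call_with_backoff(fn, *, max_retries: int = 5, base_delay: float = 0.01):
    """Call fn(), retrying on RateLimitError with exponential backoff."""
    for attempt in range(max_retries):
        try:
            return fn()
        except RateLimitError:
            if attempt == max_retries - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)

# Simulated flaky endpoint: fails twice with 429, then succeeds.
calls = {"n": 0}
def flaky_llm_call():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RateLimitError("429 Too Many Requests")
    return {"choices": [{"text": "ok"}]}

response = call_with_backoff(flaky_llm_call)
print(response["choices"][0]["text"], calls["n"])
```

Real provider SDKs often ship such retry logic built in, but understanding the pattern matters for tuning latency and cost under heavy load.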
Our Tech Stack & Practices:
  • Languages: Java, Kotlin (SpringBoot, Quarkus), Golang, Python (Pandas, Dask, NumPy, SciPy).
  • Machine Learning: Scikit-learn, NLTK, spaCy, TensorFlow, PyTorch.
  • Frontend: TypeScript (Angular, React).
  • Cloud & Infrastructure: AWS, Kubernetes, Terraform, CloudFormation.
  • Big Data: Trino, Spark, Hive.
  • LLM APIs: GPT, Claude, Gemini, Mistral (for NLP, automation, and data processing).
  • AI Practices: Prompt engineering & fine-tuning for domain-specific use cases.
  • Development Approach: Agile methodologies.
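As a rough illustration of the prompt-engineering practice listed above, a domain-specific few-shot prompt might be assembled as below; the task wording, labels, and example reviews are all invented.

```python
def build_prompt(task: str, examples: list[tuple[str, str]], query: str) -> str:
    """Assemble a few-shot classification prompt from labeled examples."""
    lines = [f"Task: {task}", ""]
    for text, label in examples:
        lines.append(f"Review: {text}")
        lines.append(f"Sentiment: {label}")
        lines.append("")
    # End with the unanswered query so the model completes the label.
    lines.append(f"Review: {query}")
    lines.append("Sentiment:")
    return "\n".join(lines)

examples = [
    ("The rooms were spotless and the staff friendly.", "positive"),
    ("Waited an hour and the food was cold.", "negative"),
]
prompt = build_prompt(
    "Classify the sentiment of each hotel review as positive or negative.",
    examples,
    "Gorgeous view, would come back.",
)
print(prompt)
```

Keeping prompt assembly in one tested function makes it easy to version templates and A/B-test wording changes against a benchmark set.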