[ [ { "page": 1, "text": "CHEUN DA\n(438) 410-0783 | cedricdacheun@gmail.com |Montreal| LinkedIn | Website | Github\n___________________________________________________________________________________________\nSummary of Qualifications\n● Experience in designing, testing, evaluating, and deploying machine learning model-based solutions\n● Proficient in Python, Pandas, and Matplotlib for data exploration and visualization\n● Experience with POWER BI and Tableau for creating interactive dashboards, providing real-time insights to\nstakeholders\n● Strong customer service skills, ensuring clear communication and quick problem resolution\n● Excellent written and oral communication skills in both French and English\nTechnical Skills\nProgramming & Data:: Python, JavaScript (Node.js),Java\nAI/ML Frameworks : PyTorch, TensorFlow, Hugging Face Transformers, OpenAI, LangChain,\nLlamaIndex\nAutomation Tool: n8n, Cursor(IDE)\nLLM Operations: Fine-tuning, RAG pipelines, prompt optimization, agent design\nData Engineering: Chroma, Pinecone, Weaviate, FAISS, PostgreSQL\nCloud Platforms: AWS (S3, EC2, SageMaker), Azure ML, GCP Vertex AI\nAPIs & Tools: FastAPI, Flask, Gradio, Streamlit, Docker, Git, RESTful APIs\nNLP Techniques: Text classification, summarization, sentiment analysis, entity recognition,\nembeddings\nDevOps & MLOps: CI/CD, model monitoring, containerized deployments\nLanguages: French, English (bilingual)\nWork Experience\nApplied AI Researcher January 2022 to July 2024\nUQTR Lab | Trois-Rivières (Quebec)\n● Designed and deployed large-scale machine learning pipelines using Python, Scikit-Learn, TensorFlow, and Pandas,\nachieving >99.9% model accuracy for network intrusion and fraud detection systems.\n● Implemented data preprocessing and batch processing workflows, improving computation efficiency by 40%.\n● Utilized GANs and Gaussian SMOTE to address class imbalance in large datasets.\n● Optimized ML models with feature selection algorithms and meta-feature 
engineering, reducing features by 45%\nwhile maintaining model precision.\n● Built and deployed a RAG-based chatbot (Retrieval-Augmented Generation) using LangChain, OpenAI, and LLAMA,\nintegrating real-time external data APIs.\n● Developed a multi-agent monitoring system for RSS data streams, improving alert precision and reducing latency by 60%.\n● Deployed prototypes using Docker and Kubernetes on AWS for scalability and CI/CD integration.\nFull Stack Developer February 2020 to August 2020\nTravaris | Tunis, Tunisia\n● Built high-performance web and mobile apps (ReactJS, React Native, Node.js) optimized for sub-200 ms response times.\n● Designed and deployed REST APIs (Feathers.js, GraphQL) with caching and SQL data storage, supporting 100k+\nreq/min at 99.9% uptime.\n● Integrated automated web scraping and data ingestion pipelines with Selenium and SQL.\nWeb & Mobile Full Stack Developer June 2019 to August 2019\nPERFORM VR │ Montpellier, France\n● Assisted in cross-platform app migration (Android → iOS) using Swift and OpenGL, achieving <100ms latency.\n● Implemented testing pipelines (unit, integration, regression), ensuring 95% test success rate.", "source": "resume.pdf" }, { "page": 2, "text": "● Managed version control on GitLab, ensuring smooth versioning with a continuous delivery cycle.\nEducation & Certifications\nMaster's in Applied Mathematics and Computer Science January 2022 to July 2024\nUniversité du Québec à Trois-Rivières (UQTR) │ Trois-Rivières, Quebec\nBachelor's Degree in Computer Engineering January 2015 to December 2020\nESPRIT École Supérieure Privée d'Ingénierie et de Technologies │ Tunis, Tunisia\nData Analyst Program – NPower Canada August 2025 to October 2025\nIBM Certified Data Analyst Professional October 2025\nMicrosoft Azure AI Certification September 2025\nAWS 
Solution Architect Professional Certification February 2025\nAWS Cloud Certification November 2024\nVolunteer Experience\nVolunteer at Afromusée September 2024 to December 2024\nAfromusée │ Montreal, QC\n● Contributed to the planning and organization of cultural events and conferences, ensuring smooth execution and a\n20% increase in attendance compared to previous editions\n● Welcomed participants at events and conferences, providing a warm and professional experience, which resulted in\na 95% satisfaction rate in post-event evaluations\n● Actively participated in networking nights every Sunday, facilitating exchanges and collaborations between\nparticipants, contributing to new partnerships and cultural projects. These events helped strengthen community\nbonds and increased engagement by 30%\n● Assisted during museum exhibits, providing enriching information to visitors, contributing to an enhanced\neducational experience and increasing visitor retention rates", "source": "resume.pdf" } ], [ { "page": 1, "text": "Contact\ncedricdacheun@gmail.com\nwww.linkedin.com/in/cheun-da\n(LinkedIn)\ncheunanthony.github.io/ (Portfolio)\nTop Skills\nWeb Scraping\nDatabases\nData Manipulation\nLanguages\nFrench (Native or Bilingual)\nEnglish (Full Professional)\nCertifications\nAWS Certified Solutions Architect –\nAssociate\nSwift Programming\nAzure Databricks & Spark For Data\nEngineers\nPython Data Science Toolbox\nCloud training\nCheun DA\nData Analyst | Data Scientist / Machine Learning Engineer | Certified\nAWS Solutions Architect Associate | Driving Business Value Through\nInsights, Scalable Data Systems & AI\nMontreal, Quebec, Canada\nSummary\nI’ve always believed that data tells a story, and my passion lies in\nuncovering those stories to drive smarter decisions.\n- I enjoy exploring datasets, identifying trends, and transforming raw\nnumbers into clear, actionable insights that support smarter 
business\ndecisions.\n- I’m driven by the challenge of building reliable data pipelines,\noptimizing storage, and ensuring data is accessible, scalable, and\nhigh quality.\n- I am excited by predictive modeling, AI, and advanced algorithms\nthat reveal hidden patterns and enable automation, personalization,\nand innovation.\n- I see every challenge as an opportunity to build smarter, more\nefficient data ecosystems that power innovation.\nExperience\nUniversité du Québec à Trois-Rivières\nApplied AI Researcher\nJanuary 2022 - July 2024 (2 years 7 months)\nTrois-Rivières, Quebec, Canada\nApplied AI Researcher with a strong focus on Anomaly Detection, Natural\nLanguage Processing (NLP), and a keen interest in Computer Vision and\nFinance applications of Machine Learning. Over 2 years of experience\nin designing, implementing, and deploying machine learning models to\nsolve real-world problems such as intrusion detection, fraud detection, and\ndrowsiness detection.\n  Page 1 of 5", "source": "linked_page.pdf" }, { "page": 2, "text": "Developed two advanced frameworks for intrusion detection in networks and\ncredit card fraud detection, using\nscikit-learn, NumPy, Pandas, Matplotlib, Seaborn, and TensorFlow, achieving\nmodel accuracies above 99.96%\n● Solved data imbalance using techniques like GAN, Gaussian SMOTE to\nimprove machine learning model performance\n● Optimized machine learning models by adding meta-features, applying\nselection techniques such as particle swarm\noptimization, reducing overfitting, and reducing computational cost with feature\nreduction from 87 to 47\n● Created multi-agent LLMs to monitor RSS feeds for detecting promotional\noffers, with the ability to send detailed\nnotifications about products found. Additionally, developed an insurance\ndomain-specific chatbot based on the RAG\n(Retrieval-Augmented Generation) model, using Python, LangChain, OpenAI,\nLLAMA, and Claude. 
This chatbot\nprovides accurate and personalized responses based on real-time external\ndata\n● Developed a real-time drowsiness detection system using TensorFlow,\nYOLO, and OpenCV\nSERF Burkina\nIT Consultant at SERF\nAugust 2021 - December 2021 (5 months)\nOuagadougou, Burkina Faso\nThe project involved the creation of a dynamic web platform aimed at\nenhancing the visibility of SERF Company and facilitating its transition into the\ndigital realm. The platform is designed to provide a seamless user experience,\nimprove operational efficiency, and streamline job offer management.\nKey Responsibilities:\nAs an IT Consultant for this project, my role encompassed both technical and\nstrategic tasks aimed at ensuring the successful development and deployment\nof the web platform. My key contributions included:\nRESTful Web Service Development:\nDeveloped and integrated RESTful APIs to enable seamless communication\nbetween the front-end and back-end components of the platform. This ensured\nscalability, performance, and the ability to support future integrations.\nAdministrator Interface Development:", "source": "linked_page.pdf" }, { "page": 3, "text": "Designed and implemented an intuitive administrator interface, empowering\ninternal teams to efficiently manage platform content, monitor user activities,\nand oversee job offer postings. This interface featured robust access control\nand streamlined workflows.\nJob Offer Management System:\nDeveloped a comprehensive job offer management system, allowing users to\neasily post, edit, and track job offers. This system was integrated with backend\ndatabases to ensure real-time updates and accurate tracking.\nMultilingual Platform Implementation:\nLed the translation and localization of the platform into French and English,\nensuring accessibility and a seamless experience for users across different\nlinguistic backgrounds. 
Implemented internationalization (i18n) best practices\nto ensure scalability for future languages.\nDeployment & Security:\nOversaw the deployment of the platform on a secure VPS (Virtual Private\nServer), ensuring that the system was optimized for performance and fully\nsecured against potential vulnerabilities. Managed server configuration,\ndatabase integration, and platform stability to ensure a smooth launch.\nTravaris\nFull Stack React.js and React Native Developer\nFebruary 2020 - August 2020 (7 months)\nTunisia\nDeveloped and implemented a high-performance web and mobile app with\nReactJS and React Native, allowing users (tourists and travelers) to view\ndetailed information about places (hotels, parks, museums, etc.) in a country\nvia an interactive map based on OpenStreetMap. The app achieved an\naverage response time of under 200 ms for loading place information.\nIntegrated Selenium for web scraping, enabling automated collection of data\non over 10,000 tourist destinations. The extracted data was pre-processed and\nstored efficiently in an SQL database, with a 95% real-time data update rate.\nDeveloped an optimized REST API with Feathers.js, combined with GraphQL\nand an intelligent caching system, reducing query response times by 50%\ncompared to a traditional architecture. 
The API can handle up to 100,000\nrequests per minute with 99.9% uptime.", "source": "linked_page.pdf" }, { "page": 4, "text": "PERFORM VR\nFull Stack iOS Developer\nJune 2019 - August 2019 (3 months)\nMontpellier Region, France\nAssisted the team in migrating an Android app to iOS using Swift and Xcode,\noptimizing performance to ensure a smooth experience with response times\nunder 100 ms on iPhone and iPad.\nDesigned the app using Adobe XD, integrating OpenGL and Google VR for an\nimmersive virtual reality experience, allowing users to burn up to 500 calories\nper 30-minute session, maintaining 60 fps rendering fluidity.\nImplemented unit, integration, and regression tests to ensure maximum\nstability with a test success rate of over 95% across all platforms. Continuously\ndebugged to optimize performance and ensure memory usage under 30 MB\nper session.\nManaged version control on GitLab, ensuring smooth versioning with a\ncontinuous delivery cycle. Created two types of releases: a demo version\noptimized for a quick trial experience and a full version offering advanced\nfeatures, with updates every two weeks to improve user experience.\nXtensus\nFull Stack JEE & Symfony Developer\nJune 2017 - July 2018 (1 year 2 months)\nGovernorate of Tunis, Tunisia\n- Implemented microservices architecture for better scalability and\nmaintainability of the platform\n- Built automated CI/CD pipelines with GitLab CI to ensure continuous\ndeployment of the application to AWS EC2 instances\n- Utilized Docker to containerize the application, enabling consistent\nenvironments across development, testing, and production\n- Developed and optimized REST APIs using caching and GraphQL for\nmanaging orders, users, payments, and inventory\n- Front-End Development (Symfony-based)\n- Back-End Development (JEE/Spring & Symfony)\nEducation", "source": "linked_page.pdf" }, { "page": 5, "text": "NPower\nData Analyst certification · (August 2025 - November 
2025)\nUniversité du Québec à Trois-Rivières\nMaster's degree in Applied Mathematics and Computer Science · (January\n2022 - December 2024)\nEcole Supérieure Privée d'Ingénierie et de Technologies - ESPRIT\nBac+4, Computer Science Engineering · (2015 - 2020)\nESPRIT\nComputer Science", "source": "linked_page.pdf" } ], [ { "source": "github_readmes", "file": "portfolio.md", "text": "# Portfolio Projects\n\n## KnowBot\n- Description: KnowBot is designed to be an intelligent assistant that can seamlessly interact with your organization’s internal knowledge base. Whether it's through voice or text queries, this assistant taps into your existing documents, policies, research papers, and other content to provide accurate, up-to-date, and contextually relevant answers. It blends retrieval-augmented generation (RAG) for data sourcing and a large language model (LLM) for sophisticated reasoning and answer generation.\n- Tech: Python, OpenAI, Gradio, Whisper, LLM\n\n## Multi-Agent Financial Advisor System\n- Description: This project is a production-ready multi-agent AI financial advisory system designed to deliver holistic, personalized financial guidance.\nIt integrates investment advisory, tax optimization, and retirement planning into a unified intelligent platform.\n\nThe system leverages:\n\n- Specialized AI agents coordinated by an Orchestrator Agent\n- Real-time market intelligence\n- Advanced Retrieval-Augmented Generation (RAG) with hybrid search\n- Structured financial data pipelines\n- Educational visual generation using OpenAI Images\n\nThe advisor interprets a user’s full financial profile (income, assets, liabilities, goals, risk tolerance) and produces coherent, actionable financial plans while resolving conflicts between investment growth, tax efficiency, and long-term retirement objectives.\n- Tech: Python, Gradio, LangChain, Vector-Database, Semantic Search, RAG\n\n## AzureQbot\n- Description: A modern React frontend for a Knowledge Base Chatbot 
hosted on Azure. Features a sleek chat interface with Markdown rendering, avatars, message timestamps, dark mode, and seamless integration with a Python backend API.\n- Tech: React, FastAPI, Azure, QA, Knowledgebase\n\n## AI Brochure Generator\n- Description: In this project, we built a highly efficient brochure generator using an LLM, with the option of translating the generated brochure. The solution process can be divided into three main stages. To build our graphical interface we used Gradio, which is an open-source Python framework that simplifies the creation of interactive web interfaces for machine learning models, APIs, or any Python function.\n\n1. The first step is to build a website scraper that can retrieve the content of a given website URL.\n2. After scraping the website, we send the useful content to an LLM, which generates a brochure by summarizing and extracting the key information. We chose ChatGPT and Claude for this task.\n3. The final step is to send the generated brochure to another LLM for translation into the desired language. In our context, we decided to translate it into French.\n\n- Tech: LLM, Anthropic, Gradio, BeautifulSoup\n\n## Deal Finder\n- Description: In this project we built an advanced multi-agent system that subscribes to RSS feeds and checks for new deal opportunities (products). When it finds a good deal, it sends a notification containing the title, description, price, and URL of the product it found.\n- Tech: LLM, RAG, LangChain\n\n## Insurance Chatbot\n\n- Description: In this project, we built an AI insurance chatbot to help respond to customers. To build it, we used an LLM combined with a knowledge base using the technique known as RAG. The first step is to use a knowledge base containing information about the insurance company. We extract important context from this knowledge base. 
We then vectorize the knowledge-base data so that queries retrieve the most relevant context. To prevent the LLM from making a mistake and returning a wrong answer to the client, we instruct the LLM to say it doesn't know rather than answer when the information does not exist in the knowledge base.\n- Tech: LLM, GPT, RAG, ChromaDB" } ] ]