Dr.-Ing. Pratyush Agnihotri
Working area(s)
Future Data Systems, DFKI
Contact
pratyush.agnihotri@dfki.de
Work
S2|02 A322
Hochschulstr. 10
64289 Darmstadt
I am a Senior Researcher in the Systems AI for Decision Support (SAIDE) research department at the German Research Center for Artificial Intelligence (DFKI). My research centers on building efficient, adaptive, and intelligent data processing systems, with a particular focus on distributed stream processing, resource-aware system design, and AI-driven performance optimization. In parallel, I work as a Postdoctoral Researcher in the Systems Group at TU Darmstadt.
I completed my Ph.D. in the Department of Electrical Engineering and Information Technology (ETiT) at TU Darmstadt, where my work focused on AI-based, performance-driven optimization methods and benchmarking for parallel and distributed stream processing systems operating in heterogeneous environments. During my Ph.D., I worked as a Research Scientist at the Multimedia Communication Lab (KOM) at TU Darmstadt, contributing to the C2 subproject of the DFG Collaborative Research Center MAKI.
Before starting my Ph.D., I worked in industry as an Associate Team Lead at axxessio GmbH, engaging in applied R&D projects on IoT, Smart City platforms, and voice-based assistants. Overall, I bring nearly 10 years of industry experience, bridging academic research and real-world systems development.
Research Interests:
- Distributed and Resource-efficient Stream Processing
- Autonomous Resource Management
- Performance Benchmarking and Optimization
- Heterogeneous Cloud Environments
- Internet of Things (IoT)
Previous Labs:
I am looking for motivated students who are interested in working on AI-based Data Management and Streaming Systems. If you would like to write a Bachelor's or Master's thesis, or gain experience in my ongoing projects as a HiWi, please feel free to contact me. Send your CV and transcript of records to my email address.
Conversational Data Analysis with Large Language Models and the Model Context Protocol (MCP)
This thesis explores how Large Language Models (LLMs) can enable intuitive, conversational interaction with complex data analyses through the Model Context Protocol (MCP). Students will investigate how users can “talk to their data” naturally within an MCP-based platform, covering requests of varying complexity (simple, medium, and advanced). The work will also identify current limitations of the MCP architecture and propose architectural or interface-level improvements for better usability and scalability.
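To make the setting concrete for interested students, the sketch below shows roughly how a single analysis capability could be exposed to an LLM as an MCP tool. It is only an illustrative outline, assuming the FastMCP helper from the official MCP Python SDK; the server name, example dataset, and tool are hypothetical and not part of the thesis description.

```python
# Minimal sketch: exposing one data-analysis capability as an MCP tool.
# The package ("mcp"), server name, dataset, and tool are illustrative assumptions.
import statistics
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("data-analysis")  # hypothetical server name

# Stub data; a real platform would query a database or a stream instead.
SALES = {"2022": [10.5, 12.0, 9.8], "2023": [11.2, 13.4, 12.9]}

@mcp.tool()
def summarize_sales(year: str) -> dict:
    """Return simple summary statistics for the requested year."""
    values = SALES.get(year, [])
    if not values:
        return {"error": f"no data for {year}"}
    return {"year": year, "mean": statistics.mean(values), "max": max(values), "n": len(values)}

if __name__ == "__main__":
    # An LLM client (e.g. a chat front end) discovers and calls this tool via MCP;
    # a "simple" request might map to one call, while "advanced" requests chain several tools.
    mcp.run()
```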
Cloud vs. Local LLMs in the MCP Framework: A Comparative Study
This project evaluates how smaller, locally deployable LLMs perform compared to cloud-based models within an LLM+MCP framework. The focus is on assessing analysis quality, latency, and usability when performing complex analytical tasks. The work also includes a critical evaluation of the architectural limitations of LLM-only and LLM+MCP setups based on these experiments, together with recommendations for improving their design and adaptability for hybrid (cloud–edge) environments.
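A possible starting point for the evaluation is a small harness that sends the same analytical prompts to a cloud-hosted and a locally deployed model and records latency alongside the returned answers. The sketch below is only an illustrative outline; the prompts are made up, and `query_cloud_llm` / `query_local_llm` are hypothetical placeholders for whatever client libraries or runtimes the thesis ends up using.

```python
# Illustrative benchmarking harness (an assumption, not the project's prescribed setup):
# compares response latency of a cloud-hosted vs. a locally deployed LLM on identical prompts.
import time
from statistics import mean

PROMPTS = [
    "Summarize the monthly sales trend.",           # simple
    "Compare 2022 and 2023 revenue by quarter.",    # medium
    "Detect anomalies and explain likely causes.",  # advanced
]

def query_cloud_llm(prompt: str) -> str:
    # Placeholder: replace with the cloud provider's client call.
    return "(cloud answer placeholder)"

def query_local_llm(prompt: str) -> str:
    # Placeholder: replace with the local runtime's client call.
    return "(local answer placeholder)"

def benchmark(query_fn, prompts):
    """Run each prompt once; return the answers and per-prompt latencies in seconds."""
    answers, latencies = [], []
    for prompt in prompts:
        start = time.perf_counter()
        answers.append(query_fn(prompt))
        latencies.append(time.perf_counter() - start)
    return answers, latencies

if __name__ == "__main__":
    for name, fn in [("cloud", query_cloud_llm), ("local", query_local_llm)]:
        _, latencies = benchmark(fn, PROMPTS)
        print(f"{name}: mean latency {mean(latencies):.2f}s")
```

Answer quality would of course need a separate assessment (e.g. rubric-based or reference-answer comparison); the harness only illustrates how latency measurements could be collected consistently across both deployment modes.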