Category: Hands-On

  • Model Context Protocol (MCP)

    Note: This article is still in progress. The content may change until it reaches its final version.

There is a lot of buzz about a new trend in the AI world: the Model Context Protocol, or MCP for short. In this article, I will explain what MCP is.

    LLMs Evolution

In the “Startup Ideas Podcast”, episode “Model Context Protocol (MCP), clearly explained (why it matters)”, Dr. Ras Mix explained the concept from the perspective of the evolution of LLMs and the tools and services surrounding the AI ecosystem. In the following paragraphs, I will summarize it.

In the beginning, it was just the LLM: trained on data with a knowledge cut-off at a certain point in time. Based on next-token prediction, it could answer questions from that knowledge base, but it could not perform any tasks beyond what it was trained on.

In a second phase, LLMs were connected to tools. For example, LLMs started to connect to search engines, consume and interpret various APIs, and receive additional knowledge through RAG systems. Because this was a new field, there was no standard, and everyone integrated these tools in their own way. While this approach makes LLMs smarter, the ad-hoc integrations bring their own set of problems due to the lack of standards.

The current phase, LLMs plus MCP, brings a standard way to connect to tools and services: an MCP server exposes a common protocol that LLMs can use to discover, understand, and use the available tools.
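    To give an idea of what this common protocol looks like on the wire, here is a rough sketch (written as Python dictionaries, just to keep it runnable) of the two JSON-RPC 2.0 requests an MCP client sends to a server: one to discover the available tools and one to call a chosen tool. The tool name get_weather and its arguments are hypothetical, used purely for illustration.

    ```python
    import json

    # Sketch of the two core MCP requests a client (the LLM host application)
    # sends to an MCP server over JSON-RPC 2.0.

    # 1. Discovery: ask the server which tools it exposes.
    list_tools_request = {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "tools/list",
    }

    # 2. Invocation: call one of the discovered tools with arguments.
    #    The tool name "get_weather" and its arguments are made up for illustration.
    call_tool_request = {
        "jsonrpc": "2.0",
        "id": 2,
        "method": "tools/call",
        "params": {
            "name": "get_weather",
            "arguments": {"city": "Berlin"},
        },
    }

    print(json.dumps(list_tools_request, indent=2))
    print(json.dumps(call_tool_request, indent=2))
    ```

    The response to tools/list includes a name, a description, and an input schema for each tool, which is what lets the model “understand” what a tool does before deciding to call it.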

    MCP Terminology

    Model – refers to the LLM model in use.

    Context – refers to the context provided by the tools.

    Protocol – refers to this common standard that allows the model (LLM) to understand the context provided by different sources.
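    To make the terminology concrete, below is a minimal sketch of an MCP server built with the FastMCP helper from the official Python SDK (installable as the mcp package). The server name and the example tool are my own illustrative choices, and the exact SDK surface may differ between versions.

    ```python
    # Minimal MCP server sketch using the FastMCP helper from the official
    # Python SDK ("mcp" package). The server name and the example tool below
    # are hypothetical, chosen only for illustration.
    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("demo-tools")

    @mcp.tool()
    def add(a: int, b: int) -> int:
        """Add two numbers and return the result."""
        return a + b

    if __name__ == "__main__":
        # By default this serves the tool over stdio, so an MCP-aware client
        # (the host application running the model) can discover and call it.
        mcp.run()
    ```

    In terms of the definitions above: the model is whatever LLM the host application runs, the context is the tool description and results this server provides, and the protocol is the standard message format that lets any MCP-aware model use it without custom integration code.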

  • Manage Your Tasks with ChatGPT Reminders

    This week, OpenAI introduced a set of task management features for ChatGPT. Now, you can create one-time or recurring reminders, or define more complex tasks that the AI should run at specific intervals.

The feature is currently rolling out to users on paid plans: Plus, Pro, or Team. For more details, see the Scheduled tasks in ChatGPT article on the OpenAI help pages.

    Examples of Managing Your Tasks with ChatGPT

    Remind me tomorrow morning at 9am to buy milk
Can you provide me, every day at 2:15pm, with an executive summary of the last 24 hours of news from the domain of Artificial Intelligence (AI)?

    I am particularly excited about prompts like the second example, since I’ll receive a daily briefing on what’s happening in the AI space. It’s a quick way to stay informed without manually checking updates.

    How the Flow Looks in Images

    The model selector:

    • There is a new model option, GPT-4o (scheduled tasks), clearly marked as beta.
Screenshot showcasing the ChatGPT tasks interface and its features.

    The prompt and the answer:

    • The usual workflow: you enter your prompt, ChatGPT responds.
    • Look out for the new visual elements indicating that a task has been scheduled.
Screenshot of the ChatGPT tasks interface showcasing scheduled reminders.

    Task management:

    • After creating a task, you can edit, pause, or view all tasks in a dedicated task manager area.
Screenshot of the ChatGPT tasks interface showcasing scheduled reminders.

    Notification:

    • When it’s time for your task to run, you’ll receive an email or push notification (depending on your settings).

    My own email alert looked like this:

Screenshot showcasing the ChatGPT tasks interface and its features.

and the chat is updated with this new message (you can jump to it directly by following the New Message action from your email):

Screenshot showcasing the ChatGPT tasks interface and its features.

    These new scheduling features make it easier to offload your routine reminders and automations to ChatGPT, so you can focus on what truly matters. How do you feel about trusting an AI to handle your daily to-dos? Let me know if you have any creative ideas for using these new tools in your own workflow.