Power Up Your AI App with Next.js 15 MCP Chat Client and Composio

Want to create a cutting-edge AI chatbot in Next.js 15? Build an MCP-powered chat client that connects to remote and local servers, rivaling tools like Claude or Windsurf. This 2025 tutorial guides you step-by-step to craft a chatbot that sends emails, creates GitHub issues, or schedules meetings—all with natural language prompts.
You'll get clear code snippets, setup tips, and a checklist to launch your chatbot fast. Ideal for developers aiming to automate workflows or enhance user experiences, this guide has you covered. Let's dive in!
For Beginners: MCP (Model Context Protocol) links AI models to tools like Gmail or Slack for real-time actions. A tool call is when the AI triggers an action, like posting a Slack message.
Why Build a Next.js Chatbot with MCP?
Standard AI chatbots are limited by static data, making them less useful for real-world tasks. MCP (Model Context Protocol) transforms your chatbot by connecting it to external tools for real-time data and actions. A developer on X shared, "MCP turned my Next.js chatbot into a true assistant—automation is a game-changer!"
This tutorial covers:
- Understanding MCP's core components.
- Building a Next.js chat MCP client.
- Connecting to Composio-hosted MCP servers.
- Setting up a local MCP server for custom control.
What is MCP?
MCP bridges AI models with tools, addressing two key limitations:
- Outdated Data: MCP fetches real-time data via APIs.
- No Actions: MCP enables tasks like emailing or creating issues.
Real-World Uses:
- Send emails via Gmail or Outlook.
- Post messages in Slack.
- Create GitHub issues, PRs, or commits.
- Schedule meetings in Google Calendar.
MCP Components:
- Hosts: Apps (e.g., IDEs) that consume MCP data.
- Clients: Your chatbot, connecting to MCP servers.
- Servers: Programs providing tool access via MCP.
- Local Data: Files or databases on your device.
- Remote Services: APIs like Linear or Gmail.
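To make the "tool call" idea concrete, here is a minimal sketch (with hypothetical values) of the OpenAI-style function-calling payload that MCP tooling ultimately maps onto. Note that the model returns `arguments` as a JSON *string*, which your client must parse before executing the tool:

```typescript
// Hypothetical tool call, shaped like the `tool_calls` entries
// returned by an OpenAI-style chat completion (used later in this tutorial).
type ToolCall = {
  id: string;
  type: 'function';
  function: { name: string; arguments: string }; // arguments is a JSON string
};

const exampleCall: ToolCall = {
  id: 'call_123',
  type: 'function',
  function: {
    name: 'gmail_send_email', // hypothetical tool name
    arguments: JSON.stringify({ to: 'dev@example.com', subject: 'Hello' }),
  },
};

// Before executing, the client parses the argument string back into an object.
export function parseToolArgs(call: ToolCall): Record<string, any> {
  return JSON.parse(call.function.arguments || '{}');
}

console.log(parseToolArgs(exampleCall).subject); // → Hello
```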
Project Setup
Let's set up the Next.js project and required dependencies.
Prerequisite: Install Node.js 18+ and npm before starting.
Initialize Next.js
Create a new Next.js app:
```bash
npx create-next-app@latest chat-nextjs-mcp-client --typescript --tailwind --eslint --app --src-dir --use-npm
cd chat-nextjs-mcp-client
```
Install Dependencies
Add necessary packages:
```bash
npm install composio-core openai dotenv @modelcontextprotocol/sdk
```
Set Up Composio
For Composio-hosted MCP servers (recommended for testing):
- Sign up at Composio and get an API key.
- Add to .env:
```bash
COMPOSIO_API_KEY=<your_composio_api_key>
OPENAI_API_KEY=<your_openai_api_key>
```
- Install Composio CLI:
```bash
npm i -g composio-core
composio login
composio whoami
```
- Add integrations (e.g., Gmail, Linear):
```bash
composio add linear
composio add gmail
```
Verify:
```bash
composio integrations
```
Tip: Use Composio's sandbox to test integrations safely.
Creating the Chat Interface
Build the chat UI in components/chat.tsx:
```tsx
'use client';

import { useState } from 'react';
import { ArrowUpIcon } from 'lucide-react';
import { Button } from '@/components/ui/button';
import { Tooltip, TooltipContent, TooltipTrigger } from '@/components/ui/tooltip';
import { AutoResizeTextarea } from '@/components/autoresize-textarea';
import { Card, CardContent, CardHeader, CardTitle } from '@/components/ui/card';
import { Accordion, AccordionContent, AccordionItem, AccordionTrigger } from '@/components/ui/accordion';

type Message = {
  role: 'user' | 'assistant';
  content: string | object;
  toolResponses?: any[];
};

export function Chat() {
  const [messages, setMessages] = useState<Message[]>([]);
  const [input, setInput] = useState('');
  const [loading, setLoading] = useState(false);

  const handleSubmit = async (e: React.FormEvent<HTMLFormElement>) => {
    e.preventDefault();
    if (!input.trim() || loading) return; // ignore empty or duplicate submissions

    const userMessage: Message = { role: 'user', content: input };
    setMessages((prev) => [...prev, userMessage]);
    setInput('');
    setLoading(true);

    // Temporary loading message, replaced once the server responds
    setMessages((prev) => [...prev, { role: 'assistant', content: 'Processing...' }]);

    try {
      const res = await fetch('/api/chat', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ messages: [...messages, userMessage] }),
      });

      const data = await res.json();
      const content = typeof data?.content === 'string' ? data.content : 'No server response';
      const toolResponses = Array.isArray(data?.toolResponses) ? data.toolResponses : [];

      setMessages((prev) => [
        ...prev.slice(0, -1),
        { role: 'assistant', content, toolResponses },
      ]);
    } catch (err) {
      setMessages((prev) => [
        ...prev.slice(0, -1),
        { role: 'assistant', content: 'Error: Something went wrong.' },
      ]);
    } finally {
      setLoading(false);
    }
  };

  const handleKeyDown = (e: React.KeyboardEvent<HTMLTextAreaElement>) => {
    if (e.key === 'Enter' && !e.shiftKey) {
      e.preventDefault();
      handleSubmit(e as unknown as React.FormEvent<HTMLFormElement>);
    }
  };

  return (
    <main className="mx-auto flex h-screen w-full max-w-[35rem] flex-col">
      <div className="flex-1 overflow-y-auto px-6 py-6">
        {messages.length === 0 ? (
          <header className="m-auto max-w-96 text-center">
            <h1 className="text-2xl font-semibold">MCP-Powered AI Chatbot</h1>
            <p className="text-sm text-gray-500">Built with Next.js and MCP</p>
          </header>
        ) : (
          <div className="my-4 flex flex-col gap-4">
            {messages.map((message, index) => (
              <div key={index} className="flex flex-col gap-2">
                {message.role === 'assistant' && message.toolResponses && message.toolResponses.length > 0 && (
                  <Card>
                    <CardHeader>
                      <CardTitle>Tool Responses</CardTitle>
                    </CardHeader>
                    <CardContent>
                      <Accordion type="multiple">
                        {message.toolResponses.map((toolRes, i) => (
                          <AccordionItem key={i} value={`item-${i}`}>
                            <AccordionTrigger>Tool Call #{i + 1}</AccordionTrigger>
                            <AccordionContent>
                              <pre className="text-sm">{JSON.stringify(toolRes, null, 2)}</pre>
                            </AccordionContent>
                          </AccordionItem>
                        ))}
                      </Accordion>
                    </CardContent>
                  </Card>
                )}
                <div
                  className={`max-w-[80%] rounded-xl px-3 py-2 text-sm ${message.role === 'user' ? 'self-end bg-blue-500 text-white' : 'self-start bg-gray-100'}`}
                >
                  {typeof message.content === 'string' ? message.content : JSON.stringify(message.content, null, 2)}
                </div>
              </div>
            ))}
          </div>
        )}
      </div>
      <form
        onSubmit={handleSubmit}
        className="relative mx-6 mb-6 flex items-center rounded-[16px] border px-3 py-1.5"
      >
        <AutoResizeTextarea
          onKeyDown={handleKeyDown}
          onChange={setInput}
          value={input}
          placeholder="Enter a message"
          className="flex-1 bg-transparent outline-none"
        />
        <Tooltip>
          <TooltipTrigger asChild>
            <Button
              type="submit"
              variant="ghost"
              size="sm"
              className="absolute right-1"
              disabled={!input.trim() || loading}
            >
              <ArrowUpIcon size={16} />
            </Button>
          </TooltipTrigger>
          <TooltipContent>Send Message</TooltipContent>
        </Tooltip>
      </form>
    </main>
  );
}
```
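The `handleSubmit` flow above appends a temporary "Processing..." message, then swaps it for the real reply with `prev.slice(0, -1)`. That optimistic-update pattern can be isolated as a pure helper (a sketch; the component above simply inlines it):

```typescript
type Msg = {
  role: 'user' | 'assistant';
  content: string | object;
  toolResponses?: any[];
};

// Replace the last (placeholder) message with the final assistant reply,
// without mutating the original array.
export function replaceLast(messages: Msg[], reply: Msg): Msg[] {
  return [...messages.slice(0, -1), reply];
}

const history: Msg[] = [
  { role: 'user', content: 'Send a Gmail' },
  { role: 'assistant', content: 'Processing...' },
];

const updated = replaceLast(history, { role: 'assistant', content: 'Done' });
console.log(updated.length); // → 2
console.log(updated[1].content); // → Done
```

Because the helper copies the array, the previous React state stays untouched, which is exactly what the `setMessages((prev) => ...)` updater requires.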
In components/autoresize-textarea.tsx:
```tsx
'use client';

import { cn } from '@/lib/utils';
import { useRef, useEffect } from 'react';

interface AutoResizeTextareaProps {
  value: string;
  onChange: (value: string) => void;
  onKeyDown?: React.KeyboardEventHandler<HTMLTextAreaElement>;
  className?: string;
  placeholder?: string;
}

export function AutoResizeTextarea({ value, onChange, onKeyDown, className, placeholder }: AutoResizeTextareaProps) {
  const textareaRef = useRef<HTMLTextAreaElement>(null);

  const resizeTextarea = () => {
    const textarea = textareaRef.current;
    if (textarea) {
      textarea.style.height = 'auto';
      textarea.style.height = `${textarea.scrollHeight}px`;
    }
  };

  useEffect(() => {
    resizeTextarea();
  }, [value]);

  return (
    <textarea
      value={value}
      ref={textareaRef}
      rows={1}
      onKeyDown={onKeyDown}
      onChange={(e) => {
        onChange(e.target.value);
        resizeTextarea();
      }}
      placeholder={placeholder}
      className={cn('resize-none min-h-4 max-h-80', className)}
    />
  );
}
```
Connecting to Composio MCP Servers
Set up the API route in app/api/chat/route.ts:
```ts
import { NextRequest, NextResponse } from 'next/server';
import { OpenAI } from 'openai';
import { OpenAIToolSet } from 'composio-core';

const toolset = new OpenAIToolSet();
const client = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

export async function POST(req: NextRequest) {
  const { messages } = await req.json();
  const userQuery = messages[messages.length - 1]?.content;

  if (!userQuery) {
    return NextResponse.json({ error: 'No user query found' }, { status: 400 });
  }

  try {
    const tools = await toolset.getTools({ apps: ['gmail', 'linear'] });
    const fullMessages = [
      { role: 'system', content: 'You are a helpful assistant with tool access.' },
      ...messages,
    ];

    const response = await client.chat.completions.create({
      model: 'gpt-4o-mini',
      messages: fullMessages,
      tools,
    });

    const aiMessage = response.choices[0].message;
    const toolCalls = aiMessage.tool_calls || [];

    if (toolCalls.length > 0) {
      const toolResponses = [];
      for (const toolCall of toolCalls) {
        const res = await toolset.executeToolCall(toolCall);
        toolResponses.push(res);
      }
      return NextResponse.json({
        role: 'assistant',
        content: 'Tool calls executed successfully 🎉',
        toolResponses,
      });
    }

    return NextResponse.json({
      role: 'assistant',
      content: aiMessage.content || 'No server response',
    });
  } catch (err) {
    console.error(err); // surface the real error in server logs
    return NextResponse.json({ error: 'Something went wrong' }, { status: 500 });
  }
}
```
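The route rejects requests whose latest message has no content. That guard can be expressed as a small standalone helper (a sketch mirroring the route's logic; `getUserQuery` is an illustrative name, not part of the Composio API):

```typescript
type ChatMessage = { role: string; content?: string };

// Extract the most recent user query, or null if the history is empty
// or the last message has no content — mirrors the 400 guard in the route.
export function getUserQuery(messages: ChatMessage[]): string | null {
  const last = messages[messages.length - 1];
  return last?.content ? last.content : null;
}

console.log(getUserQuery([{ role: 'user', content: 'Create a Linear issue' }]));
// → Create a Linear issue
console.log(getUserQuery([])); // → null
```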
Using a Local MCP Server
For custom tools, set up a local MCP server:
- Clone the sample server:
```bash
git clone --depth 1 --branch custom-fs-mcp-server https://github.com/shricodev/chat-nextjs-mcp-client.git chat-mcp-server
```
- Build it, then note the absolute path of the compiled server script (you'll need it below):
```bash
cd chat-mcp-server && npm run build
realpath custom-fs-mcp-server/build/index.js
```
- Start the Next.js app (the stdio transport spawns the MCP server process for you, so the server does not need to run separately):
```bash
npm run dev
```
- In lib/mcp-client/index.ts:
```ts
import { OpenAI } from 'openai';
import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { StdioClientTransport } from '@modelcontextprotocol/sdk/client/stdio.js';
import dotenv from 'dotenv';

dotenv.config();

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });
const mcp = new Client({ name: 'nextjs-mcp-client', version: '1.0.0' });
let tools: any[] = [];
let connected = false;

export async function initMCP(serverScriptPath: string) {
  if (connected) return;
  // Use python3 for .py servers, the current Node binary otherwise
  const command = serverScriptPath.endsWith('.py') ? 'python3' : process.execPath;
  const transport = new StdioClientTransport({ command, args: [serverScriptPath, '/home/user'] });
  await mcp.connect(transport);
  const toolsResult = await mcp.listTools();
  // Map MCP tool descriptions into the OpenAI function-tool format
  tools = toolsResult.tools.map((tool) => ({
    type: 'function',
    function: { name: tool.name, description: tool.description, parameters: tool.inputSchema },
  }));
  connected = true;
}

export async function executeToolCall(toolCall: any) {
  const toolName = toolCall.function.name;
  const toolArgs = JSON.parse(toolCall.function.arguments || '{}');
  const result = await mcp.callTool({ name: toolName, arguments: toolArgs });
  return { id: toolCall.id, name: toolName, arguments: toolArgs, result: result.content };
}

export async function processQuery(messages: any[]) {
  const fullMessages = [{ role: 'system', content: 'You are a helpful assistant.' }, ...messages];
  const response = await openai.chat.completions.create({ model: 'gpt-4o-mini', messages: fullMessages, tools });
  const replyMessage = response.choices[0].message;
  const toolCalls = replyMessage.tool_calls || [];

  if (toolCalls.length > 0) {
    const toolResponses = [];
    for (const toolCall of toolCalls) {
      const toolResponse = await executeToolCall(toolCall);
      toolResponses.push(toolResponse);
      fullMessages.push({ role: 'assistant', content: null, tool_calls: [toolCall] });
      fullMessages.push({ role: 'tool', content: JSON.stringify(toolResponse.result), tool_call_id: toolCall.id });
    }
    // Ask the model to summarize the tool results in a final reply
    const followUp = await openai.chat.completions.create({ model: 'gpt-4o-mini', messages: fullMessages });
    return { reply: followUp.choices[0].message.content || '', toolCalls, toolResponses };
  }
  return { reply: replyMessage.content || '', toolCalls: [], toolResponses: [] };
}
```
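`initMCP` converts each MCP tool description into the OpenAI function-tool format. The mapping is mechanical and worth seeing in isolation (a sketch using a hypothetical filesystem tool):

```typescript
type McpTool = { name: string; description?: string; inputSchema: object };

// Wrap an MCP tool listing in the `tools` array shape expected by the
// OpenAI chat completions API: a `type: 'function'` entry whose
// `parameters` field is the tool's JSON Schema.
export function toOpenAITool(tool: McpTool) {
  return {
    type: 'function' as const,
    function: {
      name: tool.name,
      description: tool.description,
      parameters: tool.inputSchema,
    },
  };
}

const mapped = toOpenAITool({
  name: 'read_file', // hypothetical tool from a filesystem MCP server
  description: 'Read a file from disk',
  inputSchema: { type: 'object', properties: { path: { type: 'string' } } },
});
console.log(mapped.function.name); // → read_file
```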
- Update app/api/chat/route.ts:
```ts
import { NextRequest, NextResponse } from 'next/server';
import { initMCP, processQuery } from '@/lib/mcp-client';

const SERVER_PATH = '<path_to_index.js>';

export async function POST(req: NextRequest) {
  const { messages } = await req.json();
  const userQuery = messages[messages.length - 1]?.content;

  if (!userQuery) {
    return NextResponse.json({ error: 'No query provided' }, { status: 400 });
  }

  try {
    await initMCP(SERVER_PATH);
    const { reply, toolResponses } = await processQuery(messages);
    return NextResponse.json({ role: 'assistant', content: reply, toolResponses });
  } catch (err) {
    return NextResponse.json({ error: 'Something went wrong' }, { status: 500 });
  }
}
```
Testing Your Chatbot
Test your chatbot by running:
```bash
npm run dev
```
Then open http://localhost:3000 in your browser and try prompts like "Send an email via Gmail" or "Create a Linear issue." The chatbot should display the tool call results in the Tool Responses card.
Case Study: A startup saved 10 hours weekly by automating Slack notifications with this MCP client.
MCP in 2025: What's Next?
Next.js 15's server components streamline MCP integration. Emerging trends include:
- AI agentic workflows for complex tasks.
- Composio's expanding tool marketplace.
- Local servers for enhanced privacy.
Advanced Tip: Cache tool responses server-side to minimize API calls.
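A minimal sketch of that caching tip, keyed by tool name plus serialized arguments with a time-to-live (an in-memory illustration; a production app might reach for Redis or Next.js data caching instead):

```typescript
// In-memory TTL cache for tool responses, keyed by tool name + arguments.
const cache = new Map<string, { value: unknown; expires: number }>();

export function cacheKey(name: string, args: object): string {
  return `${name}:${JSON.stringify(args)}`;
}

// Run `fn` only on a cache miss; otherwise return the cached value.
export async function cachedToolCall(
  name: string,
  args: object,
  fn: () => Promise<unknown>,
  ttlMs = 60_000,
): Promise<unknown> {
  const key = cacheKey(name, args);
  const hit = cache.get(key);
  if (hit && hit.expires > Date.now()) return hit.value; // cache hit
  const value = await fn();
  cache.set(key, { value, expires: Date.now() + ttlMs });
  return value;
}
```

Wrapping read-only tool calls (listing issues, fetching calendar events) this way avoids repeated round-trips; mutating calls like sending an email should never be cached.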
FAQ
What is MCP in Next.js? MCP connects AI models to tools for real-time actions.
Should I use Composio or a local MCP server? Composio is faster to set up; local servers offer custom control.
Conclusion
Ready to power up your Next.js 15 app? Build this MCP chat client to automate tasks and delight users. From emails to GitHub PRs, your chatbot will shine. Share your results in the comments or join our developer community!