Tutorial

LINE MCP Server: Connect AI Agents to LINE with Model Context Protocol

Complete tutorial for connecting AI agents like Claude and Gemini to LINE Official Account using the official LINE Bot MCP Server. Learn setup, configuration, Flex Messages, and natural language bot control with the Model Context Protocol.

LineBot.pro Team · 15 min read

#Introduction to MCP and LINE

The way AI agents interact with external services is undergoing a fundamental shift. Until recently, connecting an AI assistant like Claude or Gemini to a messaging platform like LINE required custom webhook code, API wrappers, and significant development effort. That has changed with the Model Context Protocol (MCP) and the official LINE Bot MCP Server.

MCP is now the de facto standard for AI agent integration, with over 97 million monthly SDK downloads and an ecosystem of 5,800+ available servers. Major AI companies including OpenAI, Google, and Microsoft have adopted MCP as their preferred integration protocol. For businesses operating on LINE, this means AI agents can now directly control LINE Official Accounts through natural language -- no custom code required.

#Why This Matters for LINE Businesses

LINE has over 200 million monthly active users across Japan, Thailand, Taiwan, and Indonesia. Businesses using LINE Official Accounts can now leverage AI agents to:

  • Send messages to customers using natural language commands
  • Create rich visual content with Flex Messages automatically
  • Retrieve user profiles and personalize interactions
  • Automate campaign workflows without writing LINE API code
  • Scale customer support by connecting AI assistants directly to LINE

Ready to build AI-powered LINE integrations? Explore our LINE development services to see what is possible.

#What Is the Model Context Protocol?

The Model Context Protocol (MCP) is an open standard originally developed by Anthropic that defines how AI agents communicate with external tools and services. Think of it as a universal adapter -- instead of building custom integrations for every AI model and every service, MCP provides a standardized interface that any AI agent can use.

#How MCP Works

diagram
AI Agent (Claude, Gemini, GPT, etc.)
       |
       v
  MCP Client (built into AI agent)
       |
       v
  MCP Protocol (standardized JSON-RPC)
       |
       v
  MCP Server (LINE Bot MCP Server)
       |
       v
  LINE Messaging API
       |
       v
  LINE Official Account --> Users
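
Under the hood, the MCP layer in this diagram exchanges JSON-RPC 2.0 messages. A tool invocation against the LINE server might look like the following sketch (the tool and argument names follow the tables later in this article; exact names depend on the server version):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "push_text_message",
    "arguments": {
      "user_id": "U1234567890",
      "message": "Hello from AI"
    }
  }
}
```

The AI agent never sees the LINE Messaging API directly; it only emits `tools/call` requests like this one, and the MCP server translates them into HTTP calls to LINE.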

#Key Concepts

| Concept | Description | Example |
| --- | --- | --- |
| MCP Host | The AI application that initiates connections | Claude Desktop, Cursor IDE |
| MCP Client | Protocol handler inside the host | Manages connection lifecycle |
| MCP Server | Service exposing tools via MCP | LINE Bot MCP Server |
| Tools | Specific actions the server provides | push_text_message, push_flex_message |
| Resources | Data the server can expose | User profiles, message history |

#The MCP Ecosystem in 2026

The MCP ecosystem has grown rapidly since its introduction:

  • 5,800+ available MCP servers covering databases, APIs, SaaS tools, and messaging platforms
  • 97M+ monthly SDK downloads across Python, TypeScript, Java, and Go
  • Adopted by all major AI providers: OpenAI, Google, Microsoft, Anthropic
  • Enterprise-ready: Authentication, authorization, and audit logging built into the protocol
  • Transport options: stdio for local, HTTP with SSE for remote deployments

This standardization means any MCP-compatible AI agent can connect to LINE -- not just one vendor's model. Learn more about LINE API development in our LINE API integration tutorial.

#LINE Bot MCP Server Overview

The LINE Bot MCP Server is an official implementation published by LINE Corporation at github.com/line/line-bot-mcp-server. It bridges the gap between AI agents and LINE's Messaging API, enabling natural language control of LINE Official Accounts.

#Supported Tools

The LINE Bot MCP Server exposes the following tools to AI agents:

| Tool | Description | Parameters |
| --- | --- | --- |
| push_text_message | Send a text message to a user | user_id, message |
| push_flex_message | Send a rich Flex Message | user_id, flex_content, alt_text |
| broadcast_text_message | Send text to all followers | message |
| broadcast_flex_message | Broadcast a Flex Message | flex_content, alt_text |
| get_profile | Retrieve a user's LINE profile | user_id |
| get_message_quota | Check remaining message quota | -- |
| get_followers_count | Get total follower count | -- |

#How It Differs from Traditional Integration

typescript
// Traditional approach: custom webhook + API code
// You need to write and maintain all of this:

import express from "express";
import { Client } from "@line/bot-sdk";

const app = express();
app.use(express.json());

const client = new Client({
  channelAccessToken: process.env.LINE_CHANNEL_ACCESS_TOKEN!,
});

// Handle incoming webhook events
app.post("/webhook", async (req, res) => {
  const events = req.body.events;
  for (const event of events) {
    if (event.type === "message" && event.message.type === "text") {
      // Parse intent, generate response, format message...
      await client.replyMessage(event.replyToken, {
        type: "text",
        text: "Hello!",
      });
    }
  }
  res.sendStatus(200);
});

app.listen(3000);

With the MCP approach, the AI agent controls LINE directly. You just tell Claude in natural language:

"Send a welcome message to user U1234567890 on LINE saying 'Welcome to our store! Check out our new spring collection with 20% off this week.'"

The AI agent calls push_text_message via MCP automatically.

This paradigm shift means business operators, marketers, and non-technical team members can control LINE bots through conversational AI. For more on building intelligent LINE bots, explore our LINE chatbot services.

#Setup & Configuration Guide

Setting up the LINE Bot MCP Server requires two main components: LINE API credentials and the MCP server itself. Here is a complete step-by-step guide.

#Step 1: Create a LINE Official Account and Channel

  1. Go to LINE Developers Console
  2. Create a new Provider (or select an existing one)
  3. Create a new Messaging API Channel
  4. Note your Channel Access Token (long-lived) from the Messaging API tab
  5. Note your Channel Secret from the Basic Settings tab

For a detailed walkthrough, see our guide on LINE Official Account setup.

#Step 2: Run the Server with npx

The simplest way to run the LINE Bot MCP Server is via npx:

bash
# No installation required -- runs directly
npx @line/line-bot-mcp-server

#Step 3: Run the Server with Docker (Optional)

For production deployments, Docker provides better isolation and control:

bash
# Pull the official image
docker pull ghcr.io/line/line-bot-mcp-server:latest

# Run with environment variables
docker run -it --rm \
  -e CHANNEL_ACCESS_TOKEN=your_channel_access_token \
  -e DESTINATION=your_user_id \
  ghcr.io/line/line-bot-mcp-server:latest

#Step 4: Configure with Claude Desktop

Add the MCP server to your Claude Desktop configuration file:

json
// macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
// Windows: %APPDATA%\Claude\claude_desktop_config.json
{
  "mcpServers": {
    "line-bot": {
      "command": "npx",
      "args": ["-y", "@line/line-bot-mcp-server"],
      "env": {
        "CHANNEL_ACCESS_TOKEN": "your_channel_access_token_here",
        "DESTINATION": "your_default_user_id_here"
      }
    }
  }
}

#Step 5: Configure with Docker in Claude Desktop

json
{
  "mcpServers": {
    "line-bot": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-e", "CHANNEL_ACCESS_TOKEN",
        "-e", "DESTINATION",
        "ghcr.io/line/line-bot-mcp-server:latest"
      ],
      "env": {
        "CHANNEL_ACCESS_TOKEN": "your_channel_access_token_here",
        "DESTINATION": "your_default_user_id_here"
      }
    }
  }
}

#Step 6: Verify the Connection

After restarting Claude Desktop, you should see the LINE Bot tools available. Test with a simple command:

You: "Send a test message to LINE saying Hello from AI"

Claude: I'll send that message to your LINE account now.
[Calls push_text_message with message "Hello from AI"]
Message sent successfully!

#Environment Variables Reference

| Variable | Required | Description |
| --- | --- | --- |
| CHANNEL_ACCESS_TOKEN | Yes | Long-lived token from LINE Developers Console |
| DESTINATION | No | Default user ID for push messages |

Need help with LINE development setup? Check our LINE app development services.

#Capabilities & Use Cases

Once connected, AI agents can perform powerful operations on your LINE Official Account through natural language commands.

#Sending Text Messages

Prompt: "Send a message to user U1234567890 saying:
Thank you for your order! Your tracking number is TH20260402001.
Expected delivery: April 5, 2026."

// AI agent automatically calls:
// push_text_message({ user_id: "U1234567890", message: "..." })

#Creating and Sending Flex Messages

Flex Messages are LINE's powerful rich message format. With MCP, AI agents can generate and send them from natural language descriptions:

Prompt: "Create a Flex Message for our spring sale with:
- A hero image banner
- Title: Spring Collection 2026
- 20% discount badge
- A 'Shop Now' button linking to https://shop.example.com
Send it to user U1234567890"

// AI agent generates the Flex Message JSON and calls:
// push_flex_message({
//   user_id: "U1234567890",
//   alt_text: "Spring Collection 2026 - 20% Off",
//   flex_content: { type: "bubble", hero: {...}, body: {...}, footer: {...} }
// })
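
For reference, the flex_content the agent generates from a prompt like this is ordinary Flex Message JSON. A minimal bubble along these lines would match the request above (the image and shop URLs are placeholders):

```json
{
  "type": "bubble",
  "hero": {
    "type": "image",
    "url": "https://example.com/spring-banner.jpg",
    "size": "full",
    "aspectRatio": "20:13",
    "aspectMode": "cover"
  },
  "body": {
    "type": "box",
    "layout": "vertical",
    "contents": [
      { "type": "text", "text": "Spring Collection 2026", "weight": "bold", "size": "xl" },
      { "type": "text", "text": "20% OFF this week", "color": "#ff5551", "size": "sm" }
    ]
  },
  "footer": {
    "type": "box",
    "layout": "vertical",
    "contents": [
      {
        "type": "button",
        "style": "primary",
        "action": { "type": "uri", "label": "Shop Now", "uri": "https://shop.example.com" }
      }
    ]
  }
}
```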

#Retrieving User Profiles

Prompt: "Get the LINE profile for user U1234567890"

// Returns: { displayName, userId, pictureUrl, statusMessage, language }

#Broadcasting Messages

Prompt: "Broadcast a message to all followers:
Happy Songkran! Enjoy 30% off all items this week.
Use code SONGKRAN2026 at checkout."

// AI agent calls broadcast_text_message for all followers

#Real-World Use Case Scenarios

| Scenario | AI Agent Prompt | MCP Tool Used |
| --- | --- | --- |
| Order Confirmation | "Notify customer about shipped order" | push_text_message |
| Promotional Campaign | "Send spring sale banner to all users" | broadcast_flex_message |
| Customer Lookup | "Show me this user's profile" | get_profile |
| Personalized Offer | "Send birthday coupon to this user" | push_flex_message |
| Quota Check | "How many messages can we still send?" | get_message_quota |
| Audience Insight | "How many followers do we have?" | get_followers_count |

Discover how to automate these workflows at scale with our LINE automation services.

#Advanced Integration Patterns

#Combining MCP with Webhook Handlers

For bidirectional communication, combine the MCP server (outbound) with a traditional webhook handler (inbound):

typescript
// app/api/webhooks/line/route.ts
// This handles INCOMING messages from LINE users
import { NextRequest, NextResponse } from "next/server";
import crypto from "crypto";

const CHANNEL_SECRET = process.env.LINE_CHANNEL_SECRET!;

function verifySignature(body: string, signature: string): boolean {
  const hash = crypto
    .createHmac("SHA256", CHANNEL_SECRET)
    .update(body)
    .digest("base64");
  return hash === signature;
}

export async function POST(req: NextRequest) {
  const body = await req.text();
  const signature = req.headers.get("x-line-signature") || "";

  if (!verifySignature(body, signature)) {
    return NextResponse.json({ error: "Invalid signature" }, { status: 401 });
  }

  const { events } = JSON.parse(body);

  for (const event of events) {
    if (event.type === "message" && event.message.type === "text") {
      // Forward to AI agent for processing via MCP
      await forwardToAIAgent(event);
    }
  }

  return NextResponse.json({ status: "ok" });
}

async function forwardToAIAgent(event: any) {
  // The AI agent processes the message and responds via MCP
  // (push_text_message or push_flex_message back to the user).
  // processWithAI is an app-specific bridge to your agent runtime.
  await processWithAI(event.message.text, event.source.userId);
}
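
To sanity-check the verifier above, you can compute a signature the same way LINE does and feed it back in. A small self-contained sketch (the secret and body here are dummy values for illustration):

```typescript
import crypto from "crypto";

const channelSecret = "test-secret"; // dummy value, not a real channel secret

// Compute the signature exactly as LINE does:
// HMAC-SHA256 over the raw request body, base64-encoded.
function sign(body: string, secret: string): string {
  return crypto.createHmac("SHA256", secret).update(body).digest("base64");
}

function verifySignature(body: string, signature: string, secret: string): boolean {
  return sign(body, secret) === signature;
}

const body = JSON.stringify({ events: [] });
const signature = sign(body, channelSecret);

console.log(verifySignature(body, signature, channelSecret));              // true
console.log(verifySignature(body + "tampered", signature, channelSecret)); // false
```

Note that verification must run over the raw request body exactly as received; re-serializing parsed JSON can change whitespace and break the signature.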

#Multi-Agent Architecture

diagram
Customer sends message on LINE
       |
       v
  Webhook Handler (inbound)
       |
       v
  AI Router Agent
       |
       +---> Sales Agent (product inquiries)
       |        |---> MCP: push_flex_message (product cards)
       |
       +---> Support Agent (issue resolution)
       |        |---> MCP: push_text_message (status updates)
       |
       +---> Marketing Agent (campaign management)
                |---> MCP: broadcast_flex_message (promotions)

#Building Custom MCP Tools for LINE

Extend the official server with custom tools for your business:

typescript
// custom-line-mcp-tools.ts
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { Client } from "@line/bot-sdk";
import { z } from "zod";

// buildOrderStatusFlex, buildProductCarousel, and getRecommendations
// are app-specific helpers defined elsewhere in your codebase.

const lineClient = new Client({
  channelAccessToken: process.env.CHANNEL_ACCESS_TOKEN!,
});

const server = new McpServer({
  name: "custom-line-tools",
  version: "1.0.0",
});

// Custom tool: Send order status with tracking
server.tool(
  "send_order_status",
  "Send order status update with tracking info to a LINE user",
  {
    user_id: z.string().describe("LINE user ID"),
    order_id: z.string().describe("Order ID"),
    status: z.enum(["confirmed", "shipped", "delivered"]),
    tracking_number: z.string().optional(),
  },
  async ({ user_id, order_id, status, tracking_number }) => {
    const flexMessage = buildOrderStatusFlex(order_id, status, tracking_number);
    await lineClient.pushMessage(user_id, flexMessage);
    return { content: [{ type: "text", text: `Order status sent to ${user_id}` }] };
  }
);

// Custom tool: Send product recommendation carousel
server.tool(
  "send_recommendations",
  "Send personalized product recommendations as a carousel",
  {
    user_id: z.string().describe("LINE user ID"),
    category: z.string().describe("Product category"),
    limit: z.number().default(5),
  },
  async ({ user_id, category, limit }) => {
    const products = await getRecommendations(user_id, category, limit);
    const carousel = buildProductCarousel(products);
    await lineClient.pushMessage(user_id, carousel);
    return { content: [{ type: "text", text: `Sent ${products.length} recommendations` }] };
  }
);

For more advanced chatbot architectures, explore our LINE chatbot development tutorial.

#Best Practices & Security

#Security Considerations

| Risk | Mitigation | Implementation |
| --- | --- | --- |
| Token Exposure | Never hardcode tokens | Use environment variables and secrets managers |
| Unauthorized Access | Restrict MCP server access | Run locally or behind an authentication proxy |
| Message Spoofing | Verify webhook signatures | Validate x-line-signature on all inbound webhooks |
| Rate Limiting | Respect LINE API limits | Implement token bucket or sliding window rate limiting |
| Data Leakage | Minimize data in prompts | Do not pass sensitive user data to AI agents |
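
The rate-limiting row above can be implemented with a classic token bucket in front of every push call. A minimal sketch follows (the capacity and refill rate are illustrative values, not LINE's actual limits):

```typescript
// Token bucket: allows bursts up to `capacity`, refills at `ratePerSec`.
class TokenBucket {
  private tokens: number;
  private last: number;

  constructor(private capacity: number, private ratePerSec: number) {
    this.tokens = capacity;
    this.last = Date.now();
  }

  // Returns true if a request may proceed, consuming one token.
  tryRemove(): boolean {
    const now = Date.now();
    this.tokens = Math.min(
      this.capacity,
      this.tokens + ((now - this.last) / 1000) * this.ratePerSec,
    );
    this.last = now;
    if (this.tokens >= 1) {
      this.tokens -= 1;
      return true;
    }
    return false;
  }
}

const bucket = new TokenBucket(5, 1); // burst of 5, then 1 request/second
```

Before each push_text_message or push_flex_message call, check `bucket.tryRemove()`; when it returns false, queue the message for later instead of calling the API.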

#Token Management

typescript
// Secure token management for production
// Use a secrets manager instead of plain environment variables

import { SecretManagerServiceClient } from "@google-cloud/secret-manager";

const client = new SecretManagerServiceClient();

async function getLineToken(): Promise<string> {
  const [version] = await client.accessSecretVersion({
    name: "projects/my-project/secrets/line-channel-token/versions/latest",
  });
  return version.payload?.data?.toString() || "";
}

// Rotate tokens periodically
async function rotateToken(): Promise<void> {
  const newToken = await requestNewChannelToken();
  await client.addSecretVersion({
    parent: "projects/my-project/secrets/line-channel-token",
    payload: { data: Buffer.from(newToken) },
  });
}

#Performance Optimization

  1. Use stdio transport for local development: Fastest connection with minimal overhead
  2. Use HTTP+SSE for remote deployments: Enables multiple concurrent AI agent connections
  3. Cache user profiles: Avoid repeated get_profile calls for the same user
  4. Batch broadcasts wisely: LINE limits broadcast messages per month based on your plan
  5. Monitor message quota: Use get_message_quota before large campaigns
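
Point 3 above (caching user profiles) can be as simple as an in-memory TTL map in front of get_profile. A sketch, where the `fetcher` parameter is a stand-in for the real MCP or API call:

```typescript
// Minimal in-memory TTL cache for LINE user profiles.
interface Profile {
  displayName: string;
  userId: string;
}

const TTL_MS = 5 * 60 * 1000; // keep entries for 5 minutes
const cache = new Map<string, { profile: Profile; expires: number }>();

async function getProfileCached(
  userId: string,
  fetcher: (id: string) => Promise<Profile>, // real get_profile call goes here
): Promise<Profile> {
  const hit = cache.get(userId);
  if (hit && hit.expires > Date.now()) return hit.profile;
  const profile = await fetcher(userId);
  cache.set(userId, { profile, expires: Date.now() + TTL_MS });
  return profile;
}
```

Within the TTL window, repeated lookups for the same user return the cached profile without touching the LINE API.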

#Deployment Checklist

  • Channel Access Token stored in secrets manager
  • MCP server running in isolated container
  • Webhook signature verification enabled
  • Rate limiting configured for push messages
  • Monitoring and alerting set up for API errors
  • Fallback responses configured for AI agent failures
  • User consent obtained for AI-generated messages
  • Message quota monitoring in place

#Cost Optimization Tips

| Strategy | Impact | Details |
| --- | --- | --- |
| Use free message quota first | Save 100% on first 200-500 messages/month | LINE OA free tier varies by country |
| Smart broadcasting | Reduce waste by 40-60% | Target segments instead of all followers |
| Cache AI responses | Cut AI API costs by 50% | Cache common questions and responses |
| Batch operations | Reduce API calls by 30% | Group messages for scheduled delivery |

#Getting Started with LineBot.pro

Setting up MCP integration with LINE requires understanding both the AI agent ecosystem and LINE platform APIs. LineBot.pro simplifies this entire workflow, offering pre-built integrations, visual tools, and managed infrastructure.

#What LineBot.pro Offers

  • MCP-Ready Integration: Pre-configured LINE Bot MCP Server with managed credentials and monitoring
  • AI Chatbot Builder: Create intelligent LINE bots with built-in NLP, sentiment analysis, and multilingual support -- no code required
  • Rich Message Generator: Design Flex Messages visually or with AI prompts, then deploy directly to LINE
  • Campaign Automation: Schedule broadcasts, segment audiences, and trigger messages automatically
  • Analytics Dashboard: Real-time insights into message delivery, user engagement, and AI performance

#Plans & Pricing

| Feature | Free | Starter (299 THB/mo) | Pro (799 THB/mo) |
| --- | --- | --- | --- |
| AI Messages | 50/month | 500/month | 2,000/month |
| MCP Integration | Basic | Full | Full + Custom Tools |
| Flex Messages | 10/month | 100/month | Unlimited |
| Rich Menus | 1 | 5 | Unlimited |
| Analytics | Basic | Advanced | Premium |

#Start Building Today

  1. Create your free account -- Get 50 free AI credits to start building
  2. Connect your LINE Official Account -- One-click integration with our platform
  3. Enable MCP integration -- Configure AI agent access to your LINE account
  4. Start sending messages -- Use natural language to control your LINE bot

Start your free trial or view pricing plans to find the right plan for your business.


Ready to Automate Your LINE Business?

Start automating your LINE communications with LineBot.pro today.