
Meetily: Building an Open-Source AI-Powered Meeting Assistant

An AI-powered meeting assistant that captures live meeting audio, transcribes it in real-time, and generates summaries while ensuring user privacy. Perfect for teams who want to focus on discussions while automatically capturing and organizing meeting content.

1. The Problem & Inspiration

Why We Built Meetily

Meetings are an essential part of professional collaboration, but capturing and summarizing key points manually is time-consuming and inefficient. While several AI-powered solutions exist, they come with three major challenges:

  1. Cloud Dependence and Privacy Concerns
    • Many users prefer keeping their meeting data local rather than sending sensitive conversations to third-party servers.
    • Enterprises and privacy-conscious organizations cannot rely on closed-source, cloud-based solutions due to data security risks.
  2. High Subscription Costs
    • Most AI-powered meeting assistants require ongoing monthly subscriptions, making them expensive for individual users, startups, and organizations.
    • There are limited open-source alternatives, leaving users with no control over AI-generated outputs.
  3. Lack of Customization and LLM Control
    • Existing tools do not allow users to tweak models, adjust transcription accuracy, or improve AI summaries.
    • Users who wish to deploy smaller, resource-efficient models often lack the flexibility to do so.

Our Mission: A Local AI Meeting Assistant

The idea was validated by recurring questions users ask across forums and social media:

  • Open source AI Notetaking app?
  • Looking for the best AI note taking app
  • What AI could help me take notes during a meeting?
  • AI Meeting Minute Tool?
  • What is the best AI Notetaker you’ve used?

Comparison

| Feature | Meetily | Otter.ai | Granola.ai |
| --- | --- | --- | --- |
| Data Privacy | Fully open-source and operates locally, ensuring complete data privacy. | Cloud-based service; data is stored on external servers, which may raise privacy concerns for sensitive information. | Processes audio directly on the user’s device without the need for meeting bots, enhancing privacy. |
| Cost | Free to use, with no subscription fees. | Free Basic plan with limited features; Pro plan at $8.33 per user/month (billed annually) with enhanced collaboration tools and increased transcription limits. | Free trial available; Individual plan at $18 per month for unlimited meetings and AI-enhanced notes. |
| Customization | Highly customizable due to its open-source nature; users can modify and extend features as needed. | Provides predefined features with limited customization options. | Allows creation of custom note templates to fit various meeting types. |
| Integration | Designed to work locally without external integrations, though cloud sync and integrations are in development. | Integrates with platforms like Zoom, Google Meet, and Microsoft Teams to automatically join and transcribe meetings. | Works across various platforms without requiring meeting bots; integrates with popular meeting platforms. |
| AI Capabilities | Uses local Large Language Models (LLMs) for transcription and summarization; performance depends on hardware capabilities. | Cloud-based AI for real-time transcription, automated summaries, and action items; offers OtterPilot™ for Sales to extract insights and write follow-up emails. | Enhances user-written notes with AI; includes GPT-4 integration for post-meeting action items such as writing follow-up emails. |
| Accessibility | Requires local installation and sufficient hardware to run AI models effectively. | Accessible via web and mobile apps; does not require significant local resources. | Available on all platforms without additional installations; processes audio directly on the device. |
| User Control | Complete control over data and application behavior due to its open-source design. | Limited control over data handling and application features, as it is a proprietary service. | Control over note formatting through customizable templates; data is processed locally, offering more control over privacy. |

The goal behind Meetily is to build a private, local-first, open-source AI meeting assistant that enables users to:

  • Generate real-time meeting transcriptions
  • Create AI-powered summaries
  • Store data locally for security & privacy
  • Run on local hardware with LLM flexibility

2. Internal Use Cases & Development Journey

How Meetily Enhances Productivity

Meetily is actively used for:

  • Internal meetings, where quick summaries are generated for marketing and operational discussions.
  • Client meetings, allowing for automatic transcription and structured action items.
  • Reducing dependency on third-party SaaS tools, ensuring complete control over API calls and AI outputs.

By integrating Meetily into daily workflows, team members can quickly review and share meeting notes without relying on manual note-taking.

Challenges in Development

  • Local LLM Models Require Optimization
    • While Whisper.cpp provides excellent transcription accuracy, LLM summarization using small models (24B and below) has been inconsistent.
    • Llama 70B (hosted on Groq) has shown promising results, but additional testing with larger models (32B+) is ongoing.
  • Hardware Constraints
    • Running LLMs locally demands high computing resources.
    • To address this, Meetily will soon support hybrid cloud-based model deployments for users who lack powerful hardware.

3. Architecture & Technical Implementation

High-Level System Architecture

Meetily is designed as a modular and extensible AI-powered assistant with the following core components:

  • Frontend (Tauri + Next.js)
    • Provides the user interface and handles real-time data exchange.
  • Backend (FastAPI & Whisper.cpp server)
    • Manages transcription requests, AI summarization, and database interactions.
  • Transcription Engine (Whisper.cpp)
    • Converts audio streams into text with high accuracy.
    • Users can fine-tune parameters to optimize performance.
  • AI Models (Ollama & External LLM APIs)
    • Ollama provides access to local LLM models for on-device summarization.
    • External LLM APIs (Anthropic Claude, Groq, Llama 70B) extend AI capabilities for those who require higher accuracy.
  • Database & Storage
    • SQLite is used for secure local storage of transcripts and summaries.
    • VectorDB supports semantic search and retrieval of past meeting notes.
  • OS & Rust Integration
    • Captures audio streams from microphones and meeting applications on macOS.
    • Enables efficient real-time processing for low-latency transcription.
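To make the backend flow above concrete, here is a minimal sketch of the local storage layer: transcript segments are appended to a SQLite database so nothing leaves the machine, and the full text can be reassembled in time order for summarization. The table schema and function names are illustrative, not Meetily's actual code; in the real app the FastAPI backend would call functions like these as segments arrive from the transcription engine.

```python
import sqlite3


def init_db(conn: sqlite3.Connection) -> None:
    # Local-first storage: transcripts never leave the user's disk.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS transcripts "
        "(meeting_id TEXT, start_ms INTEGER, text TEXT)"
    )


def add_segment(conn: sqlite3.Connection, meeting_id: str,
                start_ms: int, text: str) -> None:
    # Append one transcript segment; committing per segment keeps the
    # file consistent even if the app is closed mid-meeting.
    conn.execute(
        "INSERT INTO transcripts VALUES (?, ?, ?)",
        (meeting_id, start_ms, text),
    )
    conn.commit()


def full_transcript(conn: sqlite3.Connection, meeting_id: str) -> str:
    # Reassemble the meeting text in time order before summarization.
    rows = conn.execute(
        "SELECT text FROM transcripts WHERE meeting_id = ? ORDER BY start_ms",
        (meeting_id,),
    ).fetchall()
    return " ".join(r[0] for r in rows)


conn = sqlite3.connect(":memory:")  # on disk in the real app
init_db(conn)
add_segment(conn, "m1", 0, "Welcome everyone.")
add_segment(conn, "m1", 1500, "Let's review the roadmap.")
print(full_transcript(conn, "m1"))  # prints "Welcome everyone. Let's review the roadmap."
```

Keeping the schema this simple is what makes the local-first design cheap: a single file holds everything, and semantic search can be layered on top via the VectorDB component.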

4. Pros and Cons of Local AI Summarization

Key Advantages

  • Privacy-First: No third-party access to meeting data.
  • Cost-Efficient: No subscriptions required.
  • Flexible Model Choices: Supports both local and cloud-based LLMs.
  • Customizable & Open-Source: Users can fine-tune summarization models for specific needs.

Challenges & Considerations

  • Hardware Requirements: Running LLMs locally requires high-performance computing resources.
  • Smaller Models Have Lower Accuracy: Optimization efforts are ongoing to improve chunking and retrieval for better summarization.
  • Setup Complexity: The initial installation had user-reported issues, which have now been addressed with streamlined build scripts.


5. User Feedback & Early Adoption

Community Insights

  • Installation Experience:
    • Early users faced challenges setting up the software.
    • Solution: A simplified installation process with pre-configured builds has been introduced.
  • Feature Requests:
    • Users requested an automated bot for Google Meet & Microsoft Teams that can join meetings and transcribe discussions.
    • Many users expressed interest in automated email summaries being sent to all meeting participants.

Early Adoption Metrics

📈 70+ GitHub Stars in a short period
📈 Users returning to check for updates
📈 Increased interest in open-source, local-first AI solutions


6. Custom MVP Development & Growth Strategy

Phased Development Approach

  • Phase 1: UI prototype with predefined interactions.
  • Phase 2: Backend integration for AI-driven summarization and transcription.
  • Phase 3: Fixing installation bugs and refining user experience.

Community & Outreach Strategy

  • Engaging on Social Media: Answering common AI-related questions (e.g., “Is there a local AI meeting assistant?”).
  • Active Discussions on Open-Source Forums.
  • Encouraging Open-Source Contributions (early adopters have shown strong interest).


7. Future Roadmap & Enhancements

Upcoming Features

🛠 Optimized Chunking for Smaller LLMs to improve summarization accuracy.
🛠 Cloud Deployment Options for those needing external LLM integrations.
🛠 Conversational Chat Feature to allow interactive queries on past meeting transcripts.
🛠 UI/UX Enhancements for a more seamless experience.
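The chunking optimization mentioned above can be sketched simply: split a long transcript into overlapping word windows so each piece fits within a small local LLM's context limit, with the overlap preserving continuity across chunk boundaries. The window and overlap sizes here are illustrative defaults, not Meetily's tuned values.

```python
def chunk_transcript(text: str, max_words: int = 400, overlap: int = 50) -> list[str]:
    # Split a transcript into overlapping word windows. Each chunk is
    # summarized independently by a small LLM; the overlap carries
    # context across chunk boundaries so sentences aren't cut off blind.
    words = text.split()
    chunks = []
    step = max_words - overlap  # advance by window size minus overlap
    for start in range(0, len(words), step):
        chunks.append(" ".join(words[start:start + max_words]))
        if start + max_words >= len(words):
            break  # final window already covers the tail
    return chunks


# A 1000-word transcript yields three overlapping 400-word windows.
demo = " ".join(f"word{i}" for i in range(1000))
print(len(chunk_transcript(demo)))  # prints 3
```

Per-chunk summaries can then be merged in a second summarization pass, which is the usual map-reduce pattern for summarizing text longer than a model's context window.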

Hybrid Model Approach

📌 Users will be able to choose between local LLM models and cloud-based APIs for AI summarization.
📌 Meetily will retain a local-first design philosophy, ensuring privacy-first AI processing.
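The local-first default described above might look like the following sketch: summarization is routed to a local Ollama model unless the user explicitly opts into a cloud provider. The `ProviderConfig` type and provider names are placeholders for illustration, not Meetily's actual interfaces.

```python
from dataclasses import dataclass


@dataclass
class ProviderConfig:
    name: str    # e.g. "ollama" (local) or "groq" / "claude" (cloud)
    local: bool  # True means all processing stays on-device


def pick_provider(cfg: ProviderConfig, allow_cloud: bool) -> str:
    # Privacy-first routing: a cloud provider is only used when the
    # user has explicitly opted in; otherwise fall back to local.
    if cfg.local or allow_cloud:
        return cfg.name
    return "ollama"  # local default


# Without consent, a cloud config still resolves to the local model.
print(pick_provider(ProviderConfig("groq", local=False), allow_cloud=False))  # prints "ollama"
```

Making cloud access an explicit opt-in, rather than a silent fallback, is what keeps the hybrid approach compatible with the privacy-first design philosophy.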

Long-Term Vision

🚀 Infra4.ai Initiative: Meetily is part of a broader privacy-first AI ecosystem designed to help organizations deploy self-hosted AI solutions.
🚀 Future projects under Infra4.ai will focus on local AI models, self-hosted analytics, and data privacy enhancements.


8. Key Takeaways

💡 Meetily enables local, AI-powered meeting transcription and summarization without privacy risks.
💡 Hybrid LLM support allows users to choose between performance and privacy.
💡 Early traction indicates growing demand for open-source, local-first AI solutions.
💡 Building in public has helped shape development based on real user feedback.


Meetily is redefining the way teams handle meeting transcription and summarization by offering a privacy-first, open-source AI meeting assistant. With its local AI-powered approach, Meetily eliminates cloud dependency, ensuring data security while providing real-time transcription and intelligent summaries.

For businesses looking to develop their own AI-driven productivity tools, Meetily serves as a strong example of Custom MVP Software Development done right. Its iterative MVP Software Design approach—focusing on phased development, user feedback, and scalable architecture—demonstrates how innovative solutions can be built with privacy, flexibility, and user control at their core.

As Meetily continues to evolve with enhanced AI capabilities, hybrid cloud options, and improved UX, it is poised to become the go-to open-source solution for efficient, secure, and customizable AI note-taking. If you’re looking to build your own AI-powered MVP software, Meetily’s development journey provides valuable insights into creating scalable, privacy-first applications.

If you are looking for a fully open-source, private AI meeting assistant, check out Meetily:

🔗 GitHub Repo
🌐 Meetily Website

We would love to hear feedback from the open-source and AI community!

Zackriya Solutions
https://www.zackriya.com