Trending

Content tagged with "ai", "embeddings"

Hacker News

Top stories from the Hacker News community • Updated 4 minutes ago

Reddit

Top posts from tech subreddits • Updated 1 minute ago

AI "Boost" Backfires

i.redd.it • 38 points • 28 comments • Formal-Athlete-4241 • about 12 hours ago

Hugging Face Trending

Popular models from Hugging Face • Updated 2 minutes ago

Voxtral-Mini-3B-2507

Task: audio-text-to-text

Voxtral-Small-24B-2507

Task: audio-text-to-text

Kimi-K2-Base

Task: text-generation

LFM2-1.2B

Task: text-generation

canary-qwen-2.5b

Task: automatic-speech-recognition

medgemma-27b-it

Task: image-text-to-text

gemma-3n-E4B-it

Task: image-text-to-text
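
Each model above is listed with its Hugging Face pipeline task. As a rough illustration only, here is a minimal sketch of how one of the text-generation entries could be loaded with the 🤗 Transformers pipeline API; the model id below is a placeholder, since the feed does not show the full "org/model" repository ids, and `accelerate` is assumed to be installed for `device_map="auto"`.

```python
from transformers import pipeline

# Placeholder: substitute the full "org/model" repository id of one of the
# trending text-generation models listed above.
MODEL_ID = "org/model-id"

# Build a text-generation pipeline; device_map="auto" lets the weights be
# placed automatically on available GPUs or CPU.
generator = pipeline("text-generation", model=MODEL_ID, device_map="auto")

# Generate a short completion from a prompt.
result = generator("Embeddings are useful because", max_new_tokens=40)
print(result[0]["generated_text"])
```

The same pattern applies to the other listed tasks (e.g. automatic-speech-recognition) by changing the task string and passing the corresponding input type.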

GitHub Trending

Popular repositories from GitHub • Updated 15 minutes ago

n8n

Fair-code workflow automation platform with native AI capabilities. Combine visual building with custom code, self-host or cloud, 400+ integrations.

transformers

🤗 Transformers: the model-definition framework for state-of-the-art machine learning models across text, vision, audio, and multimodal tasks, for both inference and training.

HAMi

Heterogeneous AI Computing Virtualization Middleware (project under CNCF)

burn

Burn is a next-generation deep learning framework that doesn't compromise on flexibility, efficiency, and portability.

onnxruntime

ONNX Runtime: cross-platform, high-performance ML inferencing and training accelerator

super-gradients

Easily train or fine-tune SOTA computer vision models with one open-source training library. The home of YOLO-NAS.

mindsdb

AI's query engine: a platform for building AI that can answer questions over large-scale federated data. The only MCP server you'll ever need.

qlib

Qlib is an AI-oriented quant investment platform that aims to use AI to empower quant research, from exploring ideas to implementing them in production. It supports diverse ML modeling paradigms, including supervised learning, market dynamics modeling, and RL, and is now equipped with https://github.com/microsoft/RD-Agent to automate the R&D process.

flash-attention

Fast and memory-efficient exact attention