Articles for category: AI Tools

Create a Stunning Password Strength Analyzer with Real-Time Feedback

Have you ever struggled with creating a secure password that meets all the requirements? Or wondered if your current password is strong enough to withstand attacks? In this tutorial, I’ll walk you through building a sleek Password Strength Analyzer that provides instant visual feedback and security insights.

What We’re Building

We’ll create an interactive web app that:
- Analyzes password strength in real-time
- Shows detailed feedback on security criteria
- Checks if your password has been exposed in data breaches
- Estimates how long it would take to crack the password
- Looks amazing with a modern UI and smooth animations

You can try
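The crack-time estimate in analyzers like this is typically derived from a simple entropy heuristic: multiply the password length by the bits contributed by each character class in use. Here is a minimal stdlib sketch of that idea (the function names and the guesses-per-second rate are my own illustrative choices, not the tutorial's actual code):

```python
import math
import string

def estimate_entropy_bits(password: str) -> float:
    """Rough entropy estimate from character-class pool size.
    A common heuristic, not a rigorous strength measure."""
    pool = 0
    if any(c in string.ascii_lowercase for c in password):
        pool += 26
    if any(c in string.ascii_uppercase for c in password):
        pool += 26
    if any(c in string.digits for c in password):
        pool += 10
    if any(c in string.punctuation for c in password):
        pool += len(string.punctuation)  # 32 ASCII symbols
    return len(password) * math.log2(pool) if pool else 0.0

def crack_time_seconds(password: str, guesses_per_second: float = 1e10) -> float:
    """Average brute-force time at an assumed offline attack rate."""
    return 2 ** estimate_entropy_bits(password) / 2 / guesses_per_second

print(round(estimate_entropy_bits("password"), 1))     # 8 lowercase chars: ~37.6 bits
print(round(estimate_entropy_bits("P@ssw0rd!23"), 1))  # mixed classes: ~72.1 bits
```

For the breach check, services typically use the Have I Been Pwned range API, which only requires sending the first five characters of the password's SHA-1 hash (k-anonymity), so the password itself never leaves the browser.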

Towards a Tagalog NLP pipeline

Update (2023-12-06): I am happy to announce that two papers came out from this project: “Developing a Named Entity Recognition Dataset for Tagalog” and “calamanCy: A Tagalog Natural Language Processing Toolkit”. Tagalog is my native language. It’s spoken by 76 million Filipinos and has been the country’s official language since the 1930s. It’s a text-rich language, but unfortunately, a low-resource one. In the age of big data and large language models, building NLP pipelines for Tagalog is still difficult. In this blog post, I’ll talk about how I built a named-entity recognition (NER) pipeline for Tagalog. I’ll discuss how I

Unlocking Longer Generation with Key-Value Cache Quantization

At Hugging Face, we are excited to share with you a new feature that’s going to take your language models to the next level: KV Cache Quantization. TL;DR: KV Cache Quantization reduces memory usage for long-context text generation in LLMs with minimal impact on quality, offering customizable trade-offs between memory efficiency and generation speed. Have you ever tried generating a lengthy piece of text with your language model, only to hit a wall because of pesky memory limitations? As language models continue to grow in size and capabilities, supporting longer generations can start to really eat up memory. It’s a
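To make the memory trade-off concrete, here is a toy sketch of per-tensor int8 quantization, the basic mechanism behind quantizing cached keys and values. This is illustrative only: the actual transformers feature uses grouped quantization through backends such as quanto, enabled via the `quantized` cache implementation, not this exact scheme.

```python
# Toy per-tensor int8 quantization: one scale factor maps floats
# into [-127, 127]. Each entry then needs 1 byte instead of 4 (fp32)
# or 2 (fp16), so a long-context KV cache shrinks roughly 4x vs fp32
# at a small, bounded accuracy cost.

def quantize_int8(values):
    """Map floats to int8 range with a single scale factor."""
    scale = max(abs(v) for v in values) / 127 or 1.0
    q = [round(v / scale) for v in values]
    return q, scale

def dequantize(q, scale):
    return [v * scale for v in q]

keys = [0.12, -1.6, 0.88, 3.0, -0.41]  # stand-in for cached key values
q, scale = quantize_int8(keys)
restored = dequantize(q, scale)
max_err = max(abs(a - b) for a, b in zip(keys, restored))
print(max_err < scale / 2)  # quantization error stays under half a step: True
```

The trade-off the announcement describes falls out directly: fewer bits per cached entry means longer generations fit in the same memory budget, at the cost of the small reconstruction error shown above plus some quantize/dequantize overhead per step.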

Nomic Embed Code: A State-of-the-Art Code Retriever

Nomic Embed Code: A State-of-the-Art Code Retriever. Nomic have released a new embedding model that specializes in code, based on their CoRNStack “large-scale high-quality training dataset specifically curated for code retrieval”. The nomic-embed-code model is pretty large – 26.35GB – but the announcement also mentioned a much smaller model (released 5 months ago) called CodeRankEmbed which is just 521.60MB. I missed that when it first came out, so I decided to give it a try using my llm-sentence-transformers plugin for LLM.

llm install llm-sentence-transformers
llm sentence-transformers register nomic-ai/CodeRankEmbed --trust-remote-code

Now I can run the model like this:

llm embed -m
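Under the hood, code retrieval with a model like CodeRankEmbed is just nearest-neighbor search over embedding vectors, usually ranked by cosine similarity. A minimal sketch with tiny made-up vectors (real embeddings have hundreds of dimensions, and these toy values are purely illustrative):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 4-d "embeddings" standing in for model output.
query = [0.9, 0.1, 0.0, 0.2]  # e.g. embedding of "parse a JSON string"
snippets = {
    "def parse_json(s): ...": [0.8, 0.2, 0.1, 0.3],
    "SELECT * FROM users":    [0.1, 0.9, 0.4, 0.0],
}
best = max(snippets, key=lambda s: cosine(query, snippets[s]))
print(best)  # the JSON-parsing snippet ranks highest
```

The embedding model does the hard part of placing semantically related queries and code near each other; the retrieval step itself is this simple ranking.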

Installing Deephaven without Docker | Deephaven

Typically, when you start Deephaven, you would go through these steps, which use Docker and Docker Compose to set up a Deephaven server and its dependencies. However, running inside of Docker is not the only way you can set up Deephaven. Today, I’m going to walk through the steps necessary to get a Deephaven Community Core server running on a fresh install of Linux. We have now simplified deployment via a single application and Deephaven can more easily be run without Docker. If you have any questions, reach out on our Community Slack. Here’s the kernel I just installed on
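For reference, the Docker-free route described in the Deephaven docs boils down to a pip-installable embedded server. A rough sketch of the setup (the exact Java/Python version requirements and the `jvm_args` value here are assumptions worth checking against the current Deephaven documentation):

```shell
# Assumes a Java 11+ JDK on the PATH and a recent Python 3.
pip install deephaven-server

# Start an embedded server from Python and keep it alive:
python - <<'EOF'
from deephaven_server import Server
Server(port=10000, jvm_args=["-Xmx4g"]).start()
input("Deephaven running at http://localhost:10000 -- press Enter to stop")
EOF
```

Once the server is up, the web IDE is available on the configured port, with no Docker or Docker Compose involved.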

OMSCS CS6460 (Education Technology) Review and Tips

You might also be interested in this OMSCS FAQ I wrote after graduation. Or view all OMSCS related writing here: omscs. I recently completed the OMSCS course on Education Technology and found it to be one of the most innovative courses I’ve taken. There is no pre-defined curriculum and syllabus, though there are many videos and materials available. Learners have the autonomy and freedom to view the course videos and materials in any order and at their own pace. The course is focused around a big project, and learners pick up the necessary knowledge and skills as they progress on

[Tutorial] Chapter 5: Tabs & Dynamic blocks

Hello, everyone! Welcome to Chapter 5! This chapter is packed with exciting content as we expand functionality in the task management page, enabling various viewing options. I bet you’ve been looking forward to this, right? No worries — I’ll guide you step-by-step, and as always, we’ll breeze through it together! 5.1 Tab Containers for Organizing Blocks We’ve already set up a task management page, but to make the system even more intuitive, we want tasks to be viewable in different modes, such as Table, Kanban, Calendar, and even Gantt Chart. Using NocoBase’s tab feature, we can toggle between these different

explosion/radicli: 🕊️ Radically lightweight command-line interfaces

radicli is a small, zero-dependency Python package for creating command line interfaces, built on top of Python’s argparse module. It introduces minimal overhead, preserves your original Python functions and uses type hints to parse values provided on the CLI. It supports all common types out of the box, including complex ones like List[str], Literal and Enum, and allows registering custom types with custom converters, as well as custom CLI-only error handling. It can also export a static representation for faster --help and errors, and auto-generate Markdown documentation. Important note: This package aims to be a simple option based on the requirements of our libraries. If you’re
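The type-hint-driven approach is easy to picture with a stdlib toy: derive an argparse parser from a function's signature, so the function stays a plain Python function. This is a sketch of the general mechanism only, not radicli's actual implementation (radicli registers commands on a Radicli object with Arg annotations and handles far more cases):

```python
import argparse
import inspect
from typing import get_type_hints

def make_parser(func):
    """Build an argparse parser from a function's signature and type hints.
    Toy sketch of the idea behind radicli, not its real code."""
    parser = argparse.ArgumentParser(prog=func.__name__)
    hints = get_type_hints(func)
    for name, param in inspect.signature(func).parameters.items():
        kind = hints.get(name, str)
        if param.default is inspect.Parameter.empty:
            parser.add_argument(name, type=kind)  # required positional
        else:
            parser.add_argument(f"--{name}", type=kind, default=param.default)
    return parser

def greet(name: str, count: int = 1):
    return " ".join([f"hello {name}"] * count)

args = make_parser(greet).parse_args(["world", "--count", "2"])
print(greet(**vars(args)))  # hello world hello world
```

Because the CLI is generated from the signature, `greet` remains directly callable and testable; that separation is the property the radicli README emphasizes.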

Introducing Spaces Dev Mode for a seamless developer experience

Hugging Face Spaces makes it easy for you to create and deploy AI-powered demos in minutes. Over 500,000 Spaces have been created by the Hugging Face community and it keeps growing! As part of Hugging Face Spaces, we recently released support for “Dev Mode”, to make your experience of building Spaces even more seamless. Spaces Dev Mode lets you connect with VS Code or SSH directly to your Space. With one click, you can connect to your Space and start editing your code, removing the need to push your local changes to the Space repository using git. Let’s see how

GPT-4o got another update in ChatGPT

GPT-4o got another update in ChatGPT. This is a somewhat frustrating way to announce a new model. @OpenAI on Twitter just now: GPT-4o got another update in ChatGPT! What’s different? Better at following detailed instructions, especially prompts containing multiple requests Improved capability to tackle complex technical and coding problems Improved intuition and creativity Fewer emojis 🙃 This sounds like a significant upgrade to GPT-4o, albeit one where the release notes are limited to a single tweet. ChatGPT-4o-latest (2025-03-26) just hit second place on the LM Arena leaderboard, behind only Gemini 2.5, so this really is an update worth knowing