LIMIT and OFFSET
Learning Focus
Use this lesson to understand LIMIT and OFFSET, which are essential for exploring large datasets and building paginated queries.
Basic Syntax
SELECT columns FROM table LIMIT n;
SELECT columns FROM table LIMIT n OFFSET m;
Examples
Example 1: First 3 Rows
SELECT * FROM employees LIMIT 3;
Example 2: Top N by Salary
SELECT name, salary FROM employees ORDER BY salary DESC LIMIT 3;
Example 3: Pagination with OFFSET
-- Page 1 (rows 1-5)
SELECT * FROM employees ORDER BY id LIMIT 5 OFFSET 0;
-- Page 2 (rows 6-10)
SELECT * FROM employees ORDER BY id LIMIT 5 OFFSET 5;
-- Page 3 (rows 11-15)
SELECT * FROM employees ORDER BY id LIMIT 5 OFFSET 10;
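The page pattern above can be sketched in code. This is a minimal, hedged example using Python's stdlib sqlite3 (SQLite shares DuckDB's LIMIT/OFFSET syntax); the `employees` table and `fetch_page` helper are hypothetical, created here just for illustration.

```python
import sqlite3

# Hypothetical in-memory employees table, used only for this sketch.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany(
    "INSERT INTO employees (id, name) VALUES (?, ?)",
    [(i, f"emp{i}") for i in range(1, 16)],
)

def fetch_page(page, page_size=5):
    # OFFSET skips (page - 1) * page_size rows; ORDER BY keeps pages stable.
    offset = (page - 1) * page_size
    return conn.execute(
        "SELECT id, name FROM employees ORDER BY id LIMIT ? OFFSET ?",
        (page_size, offset),
    ).fetchall()

print(fetch_page(2))  # rows 6-10
```

Note that the OFFSET is computed from the page number, which is why a stable ORDER BY matters: without it, the same row could appear on two different pages.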
Example 4: Explore Large Files Safely
-- Preview the first 10 rows of a massive CSV
SELECT * FROM read_csv_auto('huge_dataset.csv') LIMIT 10;
Example 5: Sample Data
-- DuckDB random sampling (USING SAMPLE; TABLESAMPLE is an equivalent clause)
SELECT * FROM employees USING SAMPLE 3;
SELECT * FROM employees USING SAMPLE 50 PERCENT;
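USING SAMPLE is DuckDB-specific, but the idea of "give me n random rows" can be approximated in any engine. A minimal sketch with stdlib sqlite3 (the `employees` table here is hypothetical); note that unlike DuckDB's sampler, `ORDER BY RANDOM()` still scans and shuffles the whole table:

```python
import sqlite3

# Hypothetical table with 100 rows, used only for this sketch.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (id INTEGER PRIMARY KEY)")
conn.executemany("INSERT INTO employees (id) VALUES (?)", [(i,) for i in range(1, 101)])

# Pick 3 random rows. This is the portable fallback, not DuckDB's
# optimized USING SAMPLE, which can avoid sorting the full table.
sample = conn.execute(
    "SELECT id FROM employees ORDER BY RANDOM() LIMIT 3"
).fetchall()
print(len(sample))  # 3
```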
LIMIT Best Practices
- Always use LIMIT when exploring unknown datasets
- Pair with ORDER BY for deterministic pagination
- Use TABLESAMPLE for random sampling without full scans
- In DuckDB, LIMIT is pushed down into file scans for efficiency
Common Pitfalls
| Pitfall | Consequence | Prevention |
|---|---|---|
| LIMIT without ORDER BY | Non-deterministic rows returned | Always ORDER BY for consistent pagination |
| Using OFFSET for large pagination | Performance degrades with large offset | Use keyset pagination for large datasets |
| Assuming LIMIT prevents full file scan | For some aggregations, full scan still runs | Use TABLESAMPLE for true sampling |
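The keyset-pagination alternative mentioned in the table can be sketched as follows, again with stdlib sqlite3 and a hypothetical `employees` table: instead of skipping OFFSET rows, each page seeks past the last key seen, so the cost stays flat no matter how deep you page.

```python
import sqlite3

# Hypothetical employees table, used only for this sketch.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany(
    "INSERT INTO employees (id, name) VALUES (?, ?)",
    [(i, f"emp{i}") for i in range(1, 101)],
)

def next_page(last_id, page_size=5):
    # Keyset pagination: WHERE id > last_id replaces OFFSET, so the
    # engine seeks directly to the next key instead of skipping rows.
    return conn.execute(
        "SELECT id, name FROM employees WHERE id > ? ORDER BY id LIMIT ?",
        (last_id, page_size),
    ).fetchall()

page1 = next_page(0)
page2 = next_page(page1[-1][0])  # continue from the last id of page 1
print([r[0] for r in page2])  # [6, 7, 8, 9, 10]
```

The trade-off: keyset pagination needs a unique, ordered key and cannot jump to an arbitrary page number, but it avoids the linear skip cost that makes large OFFSETs slow.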
Quick Reference
SELECT * FROM t LIMIT 10;
SELECT * FROM t LIMIT 10 OFFSET 20;
SELECT * FROM t ORDER BY id LIMIT 5;
SELECT * FROM t USING SAMPLE 100;
SELECT * FROM t USING SAMPLE 10 PERCENT;
What's Next
- Next: NULL Values - Handle NULLs correctly in DuckDB.
- Module Overview - Return to this module index.