fix: add robots.txt and remove *.txt from gitignore
- Add robots.txt to repository for SEO compliance
- Remove *.txt from .gitignore as it was blocking robots.txt
- robots.txt allows crawling except for /api/ paths
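The crawling behavior described above can be sanity-checked with Python's standard-library robots.txt parser. This is a minimal sketch, not part of the commit; it feeds the new rules to `urllib.robotparser.RobotFileParser` and confirms that `/api/` paths are blocked while everything else stays crawlable.

```python
# Sketch: verifying the new robots.txt rules with Python's stdlib parser.
from urllib.robotparser import RobotFileParser

ROBOTS = """\
User-agent: *
Disallow: /api/
Allow: /
"""

rfp = RobotFileParser()
rfp.parse(ROBOTS.splitlines())

print(rfp.can_fetch("*", "/"))           # True: site root is crawlable
print(rfp.can_fetch("*", "/api/users"))  # False: /api/ paths are blocked
```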
--- a/.gitignore
+++ b/.gitignore
@@ -23,7 +23,6 @@
 
 # debug
 *.log
-*.txt
 npm-debug.log*
 yarn-debug.log*
 yarn-error.log*
--- /dev/null
+++ b/robots.txt
@@ -0,0 +1,6 @@
+User-agent: *
+Disallow: /api/
+Allow: /
+
+# Sitemap
+# Sitemap: https://your-domain.com/sitemap.xml