The project is dedicated to the defenders of the city of Bakhmut
Written out of inspiration to explore the Yggdrasil ecosystem after the last YaCy node there was discontinued. This engine can also be useful for crawling regular websites, small business resources, and local networks.
The project goals are a simple interface, clear architecture, and lightweight server requirements.
https://github.com/YGGverse/YGGo/tree/main/media
php 8 or newer
php-dom
php-xml
php-pdo
php-curl
php-gd
php-mbstring
php-zip
php-mysql
php-memcached
memcached
sphinxsearch
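On a Debian-based system the dependencies above could be installed roughly as follows; the package names are assumptions and may differ by distribution (there, php-xml provides the dom extension, PDO ships with the base PHP packages, and sphinxsearch may need the Sphinx project repositories on recent releases):
sudo apt install composer php php-xml php-curl php-gd php-mbstring php-zip php-mysql php-memcached memcached sphinxsearch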
git clone https://github.com/YGGverse/YGGo.git
cd YGGo
composer install
- Server configuration examples: /example/environment
- The web root dir is /src/public
- Deploy the database using the MySQL Workbench project presented in the /database folder
- Install Sphinx Search Server
- Configuration examples are presented in the /config folder
- Make sure the /src/storage/cache, /src/storage/tmp and /src/storage/snap folders are writable
- Set up the /src/crontab by following the example (a sketch is shown after this list)
- To start the crawler, add at least one initial URL using the search form or CLI
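A minimal crontab sketch for the step above; the script names, paths and intervals are illustrative assumptions based on the crawl/clean actions listed in the CLI section, so adjust them to your deployment:
# crawl the queue every minute, clean the index hourly (paths are assumptions)
* * * * * php /var/www/YGGo/src/crontab/crawler.php >> /var/log/yggo-crawler.log 2>&1
0 * * * * php /var/www/YGGo/src/crontab/cleaner.php >> /var/log/yggo-cleaner.log 2>&1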
Build third-party applications / index distribution.
Can be enabled or disabled with the API_ENABLED option.
/api.php
Returns search results.
Can be enabled or disabled with the API_SEARCH_ENABLED option.
GET action=search - required
GET query={string} - optional, search request, empty if not provided
GET type={string} - optional, filter results by MIME type where available, empty if not provided
GET page={int} - optional, search results page, 1 if not provided
GET mode=SphinxQL - optional, enables extended SphinxQL syntax
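For example, a search request could look like this; the host name yggo.example is a placeholder for your instance address:
curl "http://yggo.example/api.php?action=search&query=yggdrasil&page=1"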
Returns collected hosts with the fields listed in the API_HOSTS_FIELDS option.
Can be enabled or disabled with the API_HOSTS_ENABLED option.
GET action=hosts - required
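As a sketch, again with a placeholder host name:
curl "http://yggo.example/api.php?action=hosts"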
Returns node information for other nodes that share the same CRAWL_MANIFEST_API_VERSION and DEFAULT_HOST_URL_REGEXP conditions.
Can be enabled or disabled with the API_MANIFEST_ENABLED option.
GET action=manifest - required
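As a sketch, with a placeholder host name; other YGGo nodes could poll this endpoint to discover compatible peers for distributed crawling:
curl "http://yggo.example/api.php?action=manifest"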
word prefix:
yg*
operator OR:
hello | world
operator MAYBE:
hello MAYBE world
operator NOT:
hello -world
strict order operator (aka operator "before"):
aaa << bbb << ccc
exact form modifier:
raining =cats and =dogs
field-start and field-end modifier:
^hello world$
keyword IDF boost modifier:
boosted^1.234 boostedfieldend$^1.234
https://sphinxsearch.com/docs/current.html#extended-syntax
Can be enabled with the following attribute:
GET m=SphinxQL
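As a sketch, the same extended operators can also be passed through the search API described above; the host name is a placeholder and the query string (here ygg* | mesh) must be URL-encoded:
curl "http://yggo.example/api.php?action=search&mode=SphinxQL&query=ygg*%20%7C%20mesh"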
- Web pages full text ranking search
- Sphinx
- Unlimited content MIME crawling
- Flexible settings compatible with IPv4/IPv6 networks
- Extended search syntax support
- Compressed page history snaps with multi-provider storage sync
- Local (unlimited locations)
- Remote FTP (unlimited mirrors)
- Privacy-oriented downloads counting, traffic controls
- CSS only, JS-less interface
- Unique host ident icons
- Content MIME tabs (#1)
- Page index explorer
- Meta
- Snaps history
- Referrers
- Top hosts page
- Safe media preview
- Results with found matches highlight
- Time machine feature based on content snaps history
- Index API
- Manifest
- Search
- Hosts
- Snaps
- Context advertising API
- Auto crawl links by regular expression rules
- Pages
- Manifests
- Robots.txt / robots meta tags support (#2)
- Specific rules configuration for every host
- Automatically stop crawling when the disk quota is reached
- Transactions support to prevent data loss on queue failures
- Distributed index crawling between YGGo nodes through the manifest API
- MIME Content-type settings
- Ban links that do not match conditions to prevent extra requests
- Debug log
- Index homepages and shorter URIs with higher priority
- Collect target location links on page redirects
- Collect referrer pages (redirects including)
- URL aliasing support on PR calculation
- Host page DOM element collection by CSS selectors
- Custom settings for each host
- XML Feeds support
- Sitemap
- RSS
- Atom
- Palette image index / filter
- Crawl queue balancer that depends on available CPU
- Networks integration
- Banned pages reset by timeout
- DB tables optimization
*CLI interface is still under construction, use it at your own risk!*
- help
- db
  - optimize
- crontab
  - crawl
  - clean
- hostSetting
  - get
  - set
  - list
  - delete
  - flush
- hostPage
  - add
  - rank
    - reindex
- hostPageSnap
  - repair
    - db
    - fs
  - reindex
  - truncate
  - repair
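A hypothetical invocation of the commands above; the CLI entry point path is an assumption and may differ in your checkout:
php src/cli/yggo.php hostSetting list
php src/cli/yggo.php crontab crawl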
- Administrative panel for index moderation
- Deployment tools
- Testing
- Documentation
Please create a new branch from the main|sqliteway tree for each patch in your fork before creating a PR
git checkout main
git checkout -b my-pr-branch-name
See also: SQLite tree
- Engine sources: MIT License
- Home page animation by alvarotrigo
- CLI logo by patorjk.com
- Identicons by jdenticon
Feel free to share your ideas and bug reports!