✨ Config option to keep tcp connection alive for a certain period for subsequent requests #526
Work Expected From The Issue
Provide a config option under the `server` section of the config to allow users to keep tcp connections alive for a certain period of time for subsequent requests. The issue expects the following files to be changed:
src/results/aggregator.rs
websurfx/config.lua
Note
All the files that are expected to be changed are located under the codebase (the `websurfx` directory).
Reason Behind These Changes
The reason behind these changes is to let the user tweak the tcp connection setting so that a connection stays alive (connected to the server) for a user-specified period of time. This removes the cost of creating a new connection on each request to the same server within that period, which helps reduce response latency and improve the user experience.
Sample Code
Sample code for both of the files mentioned above is provided below:
aggregator.rs
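The aggregator.rs sample did not survive in this copy of the issue, so the following is only a minimal std-only sketch of the idea. The `Config` struct here is a hypothetical stand-in for websurfx's real parsed config, and the commented-out use of reqwest's `ClientBuilder::tcp_keepalive` is an assumption about how the HTTP client in `src/results/aggregator.rs` would consume the new option, not code from the issue:

```rust
use std::time::Duration;

/// Hypothetical subset of the parsed config; websurfx's real Config
/// type lives in its own config module and has many more fields.
#[allow(dead_code)]
struct Config {
    request_timeout: u8,
    /// New option from config.lua: seconds to keep a TCP connection
    /// alive between requests to the same upstream search engine.
    tcp_connection_keepalive: u8,
}

impl Config {
    /// Convert the keepalive setting (seconds) into the Duration that
    /// an HTTP client builder expects.
    fn keepalive(&self) -> Duration {
        Duration::from_secs(self.tcp_connection_keepalive as u64)
    }
}

fn main() {
    let config = Config {
        request_timeout: 30,
        tcp_connection_keepalive: 30,
    };
    // In aggregator.rs the shared client would then be built roughly
    // like this (assumption; requires the reqwest crate):
    //
    // let client = reqwest::ClientBuilder::new()
    //     .timeout(Duration::from_secs(config.request_timeout as u64))
    //     .tcp_keepalive(config.keepalive())
    //     .build()?;
    assert_eq!(config.keepalive(), Duration::from_secs(30));
    println!("tcp keepalive = {:?}", config.keepalive());
}
```

Reusing one client with a keepalive set lets subsequent requests to the same search engine ride the existing TCP connection instead of paying the handshake cost each time.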
config.lua
-- ### General ###
logging = true -- an option to enable or disable logs.
debug = false -- an option to enable or disable debug mode.
threads = 10 -- the amount of threads that the app will use to run (the value should be greater than 0).

-- ### Server ###
port = "8080" -- port on which the server should be launched.
binding_ip = "127.0.0.1" -- ip address on which the server should be launched.
production_use = false -- whether to use production mode or not (in other words, this option should be used if the app is hosted on a server to provide a service to a large number of users (more than one)).
-- if production_use is set to true
-- There will be a random delay before sending the request to the search engines, this is to prevent DDoSing the upstream search engines from a large number of simultaneous requests.
request_timeout = 30 -- timeout for the search requests sent to the upstream search engines to be fetched (value in seconds).
+tcp_connection_keepalive = 30 -- the amount of time the tcp connection should remain alive (or connected to the server) (value in seconds).
rate_limiter = {
    number_of_requests = 20, -- The number of requests that are allowed within the provided time limit.
    time_limit = 3, -- The time limit within which that number of requests should be accepted.
}

-- ### Search ###
-- Filter results based on different levels. The levels provided are:
-- {{
-- 0 - None
-- 1 - Low
-- 2 - Moderate
-- 3 - High
-- 4 - Aggressive
-- }}
safe_search = 2

-- ### Website ###
-- The different colorschemes provided are:
-- {{
-- catppuccin-mocha
-- dark-chocolate
-- dracula
-- gruvbox-dark
-- monokai
-- nord
-- oceanic-next
-- one-dark
-- solarized-dark
-- solarized-light
-- tokyo-night
-- tomorrow-night
-- }}
colorscheme = "catppuccin-mocha" -- the colorscheme name which should be used for the website theme
-- The different themes provided are:
-- {{
-- simple
-- }}
theme = "simple" -- the theme name which should be used for the website
-- The different animations provided are:
-- {{
-- simple-frosted-glow
-- }}
animation = "simple-frosted-glow" -- the animation name which should be used with the theme or `nil` if you don't want any animations.

-- ### Caching ###
redis_url = "redis://127.0.0.1:8082" -- redis connection url address on which the client should connect.
cache_expiry_time = 600 -- This option takes the expiry time of the search results (value in seconds and the value should be greater than or equal to 60 seconds).

-- ### Search Engines ###
upstream_search_engines = {
    DuckDuckGo = true,
    Searx = false,
    Brave = false,
    Startpage = false,
    LibreX = false,
    Mojeek = false,
    Bing = false,
} -- select the upstream search engines from which the results should be fetched.