Document and export caching of API responses #23
Merged
Conversation
add necessary modifications to NAMESPACE and the reference index
…o export_sc_cache
The constructor of the (internal) sc_table class now uses totals = FALSE as a default. This makes add_language() compatible with sc_table_saved().
By default, sc_cache_enable() now prints instructions on how keys can be set up persistently. These instructions use message() to keep the log files of our internal packages cleaner, and they can be omitted via the verbose parameter.
Previously, sc_key() always used the environment variable STATCUBE_KEY to store and return the key value. Since we have multiple servers (see #25), it is now possible to use

- STATCUBE_KEY_EXT for the external server
- STATCUBE_KEY_RED for the editing server
- STATCUBE_KEY_PROD for the production server

Part of #25.
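The lookup described above can be sketched in base R. This is a hypothetical illustration, not STATcubeR's actual implementation: the helper name `get_server_key` and the fallback to the legacy STATCUBE_KEY variable are assumptions.

```r
# Hypothetical sketch (not STATcubeR's code): map a server id to its
# environment variable and fall back to the legacy STATCUBE_KEY when the
# server-specific variable is unset.
get_server_key <- function(server = c("ext", "red", "prod")) {
  server <- match.arg(server)
  var <- switch(server,
    ext  = "STATCUBE_KEY_EXT",
    red  = "STATCUBE_KEY_RED",
    prod = "STATCUBE_KEY_PROD"
  )
  key <- Sys.getenv(var, unset = "")
  if (!nzchar(key))
    key <- Sys.getenv("STATCUBE_KEY", unset = "")  # assumed legacy fallback
  if (!nzchar(key))
    stop("no API key found for server '", server, "'")
  key
}
```

With this shape, existing setups that only define STATCUBE_KEY keep working while server-specific keys take precedence.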
Auto-detect the appropriate server in cases where it can be inferred from the request payload. Part of #25.
- namespace functions from utils and stats
- document all parameters for exported functions
- add missing parameter documentation for the R6 method sc_table_class$initialize()
Add a parameter that allows listing all datasets from the editing server instead of the external one. Part of #26.
This was linked to issues on Mar 7, 2022.
Add a new "server" parameter that allows switching between the external OGD server and the OGD editing server. The cache directory uses a subdirectory to cache the responses from the editing server.

TODO

- add documentation for the new server parameter in od_table(), od_resource() and other affected functions
- export functions that return the base urls of the different servers. This is needed for a shiny app that is built on top of STATcubeR and links to the datasets via anchor tags
- extend the "resource management" vignette and mention how caching from the editing server is handled
- add parameter checking and throw errors if server is not "ext" or "red"

Part of #26.
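The subdirectory layout and the parameter check mentioned above might look roughly like this. A hypothetical sketch: the helper name `od_cache_path` and the file naming are illustrative, not the package's exported API.

```r
# Hypothetical sketch (not STATcubeR's code): responses from the editing
# server ("red") live in a subdirectory of the cache directory, while
# external ("ext") responses stay at the top level. match.arg() throws an
# error for any server other than "ext" or "red".
od_cache_path <- function(cache_dir, id, server = c("ext", "red")) {
  server <- match.arg(server)
  dir <- if (server == "red") file.path(cache_dir, "red") else cache_dir
  file.path(dir, paste0(id, ".json"))
}
```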
For open data datasets, the FK column, which can hold information about the id of the parent element, is now included in the output of the $field() method. Part of #28.
This caused issues in case time variables were used with exactly one classification level, because of their special treatment. TODO: consider fixing this in tabulate() instead, since it kind of makes sense to use a total code here?
Hopefully, this will make it easier to get sensible defaults that work on all platforms, including joke operating systems like MS Windows.
Possibly resolves the "unexpected end of input" error.
Turns out the API keys will have to be requested, and it is preferable to use the internal servers for the STATcube GUI. Details about this are documented in our internal Confluence and linked from the keys article.
The preferences link only works when you are already logged in. Otherwise, a "session expired" page will be shown. Mention this in the web documentation of STATcubeR.
This branch now contains support for the external server of the API, which will be released on 2022-08-31. To install:

```r
remotes::install_github("statistikat/STATcubeR", ref = "export_sc_cache")
```
This server can now be used just like any other server:

```r
# provide a key
sc_key_set("XXX", server = "test")
# send a request
sc_schema_catalogue(server = "test")
```
There is now a check to determine whether STATcubeR is used inside the firewall. If not, sc_browse() and sc_browse_preferences() now forward to the external STATcube and ensure that the user is logged in when browsing STATcube. This commit also affects the $browse() method of sc_table objects, which will now determine the server from the json request and open the correct deep link to the database's info page.

Possible TODOs:

- throw errors if server is not "ext" and is_stat() is FALSE
- use an environment variable in is_stat() rather than relying on Sys.info(). Otherwise, the browse functions will only work on the two main R servers
Use the English menu options to refer to the GUI. Use a prettier example to showcase the data-tree implementation of print.sc_schema().
- explain what remaining and reset mean
- link to the vendor docs about the endpoints table and schema
Link to more related articles.
For quite some time there has been a hidden feature that allows caching of API responses from the STATcube REST API. This is very useful for our internal web application, and we will have to decide how to deal with it in the upcoming CRAN release.

Leaving it as a hidden feature might create a bad impression during reviews. Removing it would make it necessary to implement the caching logic elsewhere, which might be tricky. Therefore, it is probably best to document and export the behavior. Documentation is already available in ?sc_cache.

One problem with the current implementation is that the hashes are created via serialize() and therefore are not reusable across different R versions. It would be very handy to use something like digest::digest(), but adding another dependency package just for the hashes seems unnecessary. Maybe tools::md5sum() could be used in a different way to get a satisfying result.

TODOs

- @export to make the caching available without environment variables
- @internal to include the man pages in the index page of the documentation
- od_cache_summary(), which provides an overview of the cache contents. This would probably require some kind of cache_index.csv so we don't need to parse the cache entries.
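One way tools::md5sum() could be used "in a different way" is to hash a canonical text representation of the request instead of the serialize() bytes. The sketch below is one possible approach under that assumption, not the package's implementation; the function name `sc_cache_key` and the key=value canonical form are hypothetical.

```r
# Hypothetical sketch: derive a cache key by writing a canonical text form of
# the request to a temp file and hashing that file with tools::md5sum().
# Because the input is plain text rather than serialize() output, the hash
# does not depend on the R version's serialization format.
sc_cache_key <- function(request) {
  # sort fields so their order does not change the key
  request <- request[order(names(request))]
  canonical <- paste(names(request),
                     vapply(request, as.character, character(1)),
                     sep = "=", collapse = "&")
  tmp <- tempfile()
  on.exit(unlink(tmp))
  writeBin(charToRaw(canonical), tmp)  # raw bytes: platform-independent
  unname(tools::md5sum(tmp))
}
```

Writing raw bytes via writeBin() avoids platform-specific line endings, so the same request yields the same 32-character key on every OS and R version.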