Datapackage descriptors annotating SQLite DBs are invalid #153

Open
zaneselvans opened this issue Oct 8, 2023 · 3 comments
Labels
metadata Managing data about our data

Comments

zaneselvans (Member) commented Oct 8, 2023

The datapackage descriptors we are currently generating to annotate the SQLite DBs derived from XBRL data are not valid. For example, running this command in the ferc-xbrl-extractor environment:

frictionless validate ferc714_xbrl_datapackage.json

results in a bunch of errors like:

# -------
# invalid: sqlite:https:////Users/zane/code/catalyst/pudl-work/output/ferc714_xbrl.sqlite
# -------

## Summary

+-----------------------------+--------------------------------------------------------------------------+
| Description                 | Size/Name/Count                                                          |
+=============================+==========================================================================+
| File name (Not Found)       | sqlite:https:////Users/zane/code/catalyst/pudl-work/output/ferc714_xbrl.sqlite |
+-----------------------------+--------------------------------------------------------------------------+
| File size                   | N/A                                                                      |
+-----------------------------+--------------------------------------------------------------------------+
| Total Time Taken (sec)      | 0.002                                                                    |
+-----------------------------+--------------------------------------------------------------------------+
| Total Errors                | 1                                                                        |
+-----------------------------+--------------------------------------------------------------------------+
| Scheme Error (scheme-error) | 1                                                                        |
+-----------------------------+--------------------------------------------------------------------------+

## Errors

+-------+---------+---------+---------------------------------------------------+
| row   | field   | code    | message                                           |
+=======+=========+=========+===================================================+
|       |         | scheme- | The data source could not be successfully loaded: |
|       |         | error   | cannot create loader "". Try installing           |
|       |         |         | "frictionless-"                                   |
+-------+---------+---------+---------------------------------------------------+

Or, if we try to validate a single resource and return the errors in JSON form:

frictionless validate --json --resource-name planning_area_hourly_demand_and_forecast_summer_and_winter_peak_demand_and_annual_net_energy_for_load_table_03_2_instant ferc714_xbrl_datapackage.json
{
  "version": "4.40.11",
  "time": 0.001,
  "errors": [],
  "tasks": [
    {
      "resource": {
        "path": "sqlite:https:////Users/zane/code/catalyst/pudl-work/output/ferc714_xbrl.sqlite",
        "profile": "tabular-data-resource",
        "name": "planning_area_hourly_demand_and_forecast_summer_and_winter_peak_demand_and_annual_net_energy_for_load_table_03_2_instant",
        "dialect": {
          "table": "planning_area_hourly_demand_and_forecast_summer_and_winter_peak_demand_and_annual_net_energy_for_load_table_03_2_instant"
        },
        "title": "03.2 - Schedule - Planning Area Hourly Demand and Forecast Summer and Winter Peak Demand and Annual Net Energy for Load, Table - instant",
        "description": "ferc:SchedulePlanningAreaHourlyDemandAndForecastSummerAndWinterPeakDemandAndAnnualNetEnergyForLoadBAbstract",
        "format": "sqlite",
        "mediatype": "application/vnd.sqlite3",
        "schema": {
          "fields": [
            {
              "name": "entity_id",
              "title": "Entity Identifier",
              "type": "string",
              "format": "default",
              "description": "Unique identifier of respondent"
            },
            {
              "name": "filing_name",
              "title": "Filing Name",
              "type": "string",
              "format": "default",
              "description": "Name of filing"
            },
            {
              "name": "date",
              "title": "Instant Date",
              "type": "date",
              "format": "default",
              "description": "Date of instant period"
            },
            {
              "name": "planning_area_hourly_demand_and_forecast_summer_and_winter_peak_demand_and_annual_net_energy_for_load_axis",
              "title": "Planning Area Hourly Demand and Forecast Summer and Winter Peak Demand and Annual Net Energy for Load [Axis]",
              "type": "string",
              "format": "default",
              "description": "Typed dimension used to distinguish a set of related facts about planning area hourly demand and forecast summer and winter peak demand and annual net energy for load."
            }
          ],
          "primary_key": [
            "entity_id",
            "filing_name",
            "date",
            "planning_area_hourly_demand_and_forecast_summer_and_winter_peak_demand_and_annual_net_energy_for_load_axis"
          ]
        },
        "scheme": "",
        "hashing": "md5",
        "stats": {
          "hash": "",
          "bytes": 0
        }
      },
      "time": 0.001,
      "scope": [],
      "partial": false,
      "errors": [
        {
          "code": "scheme-error",
          "name": "Scheme Error",
          "tags": [],
          "note": "cannot create loader \"\". Try installing \"frictionless-\"",
          "message": "The data source could not be successfully loaded: cannot create loader \"\". Try installing \"frictionless-\"",
          "description": "Data reading error because of incorrect scheme."
        }
      ],
      "stats": {
        "errors": 1
      },
      "valid": false
    }
  ],
  "stats": {
    "errors": 1,
    "tasks": 1
  },
  "valid": false
}

The problem?

I think the issue here is that we are using v4 of the frictionless package, and the ability to annotate SQLite DBs was only introduced in v5. Looking at the datapackage.json file, I see that the dialect field is invalid. In frictionless v5, it would need to say sql and then point at the table within the sql dictionary; in previous versions the dialect described the CSV dialect used in the file that the path element points at. See this example of a data package resource annotating an SQLite DB:

{
            "path": "sqlite:https:////Users/zane/code/catalyst/pudl-work/output/ferc714_xbrl.sqlite",
            "profile": "tabular-data-resource",
            "name": "planning_area_hourly_demand_and_forecast_summer_and_winter_peak_demand_and_annual_net_energy_for_load_table_03_2_instant",
            "dialect": {
                "table": "planning_area_hourly_demand_and_forecast_summer_and_winter_peak_demand_and_annual_net_energy_for_load_table_03_2_instant"
            },
            "title": "03.2 - Schedule - Planning Area Hourly Demand and Forecast Summer and Winter Peak Demand and Annual Net Energy for Load, Table - instant",
            "description": "ferc:SchedulePlanningAreaHourlyDemandAndForecastSummerAndWinterPeakDemandAndAnnualNetEnergyForLoadBAbstract",
            "format": "sqlite",
            "mediatype": "application/vnd.sqlite3",
            "schema": {
                "fields": [
                    {
                        "name": "entity_id",
                        "title": "Entity Identifier",
                        "type": "string",
                        "format": "default",
                        "description": "Unique identifier of respondent"
                    },
                    {
                        "name": "filing_name",
                        "title": "Filing Name",
                        "type": "string",
                        "format": "default",
                        "description": "Name of filing"
                    },
                    {
                        "name": "date",
                        "title": "Instant Date",
                        "type": "date",
                        "format": "default",
                        "description": "Date of instant period"
                    },
                    {
                        "name": "planning_area_hourly_demand_and_forecast_summer_and_winter_peak_demand_and_annual_net_energy_for_load_axis",
                        "title": "Planning Area Hourly Demand and Forecast Summer and Winter Peak Demand and Annual Net Energy for Load [Axis]",
                        "type": "string",
                        "format": "default",
                        "description": "Typed dimension used to distinguish a set of related facts about planning area hourly demand and forecast summer and winter peak demand and annual net energy for load."
                    }
                ],
                "primary_key": [
                    "entity_id",
                    "filing_name",
                    "date",
                    "planning_area_hourly_demand_and_forecast_summer_and_winter_peak_demand_and_annual_net_energy_for_load_axis"
                ]
            }
        }
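
For comparison, here is a rough sketch of how this resource might be declared for frictionless v5, with the table name nested under a sql key inside dialect and a bare relative path to the database file. This is an untested guess at the v5 layout, not a verified fix; the title, description, and schema fields are omitted here and would be unchanged:

{
    "path": "ferc714_xbrl.sqlite",
    "profile": "tabular-data-resource",
    "name": "planning_area_hourly_demand_and_forecast_summer_and_winter_peak_demand_and_annual_net_energy_for_load_table_03_2_instant",
    "dialect": {
        "sql": {
            "table": "planning_area_hourly_demand_and_forecast_summer_and_winter_peak_demand_and_annual_net_energy_for_load_table_03_2_instant"
        }
    },
    "format": "sqlite",
    "mediatype": "application/vnd.sqlite3"
}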

Frictionless v4 can't interpret the sqlite:https:// URL as a path

As it is, the system sees the sqlite:https:// URL in the path and has no idea how to interpret it to find the data.

The sqlite:https:// path must be relative, not absolute

Beyond frictionless being unable to interpret the sqlite:https:// URL as a path at all, the URL uses an absolute rather than a relative path, which is invalid for two reasons. First, it violates the frictionless data resource specification, which says:

A “url-or-path” is a string with the following additional constraints:

  • MUST either be a URL or a POSIX path
  • URLs MUST be fully qualified. MUST be using either http or https scheme. (Absence of a scheme indicates MUST be a POSIX path)
  • POSIX paths (unix-style with / as separator) are supported for referencing local files, with the security restraint that they MUST be relative siblings or children of the descriptor. Absolute paths (/) and relative parent paths (../) MUST NOT be used, and implementations SHOULD NOT support these path types.

Second, the absolute path is simply wrong if you download our nightly build outputs, since the path to which the descriptor and databases were written on the build server has no meaning on the user's machine:

sqlite:https:////home/catalyst/pudl_work/output/ferc1_xbrl.sqlite
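
If the datapackage.json is distributed alongside the database it describes, the spec-compliant value would presumably be a plain relative sibling path, something like:

"path": "ferc1_xbrl.sqlite"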

What to do?

  • If we want to annotate SQLite DBs, I think we need to update to Frictionless v5 and follow the datapackage standards for annotating SQL sources, including using a relative path to the DB from the location of the datapackage.json file.
  • We should be validating every datapackage that we output during the nightly builds and in the integration tests for this repository (a rough sketch of such a test follows this list).
  • If we want to, I think we can annotate all of the SQLite DBs and Parquet outputs using a single large datapackage descriptor.
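
As a starting point for the validation bullet above, a minimal sketch of what such an integration test could look like using the frictionless Python API; the list of descriptor paths is hypothetical, and the report-flattening keys should be double-checked against whichever frictionless version we land on:

from frictionless import Package

# Hypothetical list of descriptor files; a real test would discover whatever
# datapackage.json files the nightly builds actually write out.
DATAPACKAGE_PATHS = [
    "ferc714_xbrl_datapackage.json",
    "ferc1_xbrl_datapackage.json",
]


def test_datapackage_descriptors_are_valid():
    """Validate each descriptor, recursing into every resource it contains."""
    for descriptor in DATAPACKAGE_PATHS:
        report = Package(descriptor).validate()
        # Unlike the metadata_valid property (which reportedly only checks
        # package-level metadata), Package.validate() also loads and checks
        # each resource, so broken paths or dialects should surface here.
        assert report.valid, report.flatten(["type", "note", "message"])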
zaneselvans added the metadata (Managing data about our data) label on Oct 8, 2023
zaneselvans changed the title from "Datapackage descriptors are invalid" to "Datapackage descriptors annotating SQLite DBs are invalid" on Oct 8, 2023
zschira (Member) commented Oct 12, 2023

@zaneselvans thanks for investigating this. We do have an integration test that is supposed to check for valid datapackages, but clearly it needs to be overhauled. We currently only test the Ferc1 datapackage, and it uses the Package property metadata_valid to check for validity, which is maybe insufficient?

zaneselvans (Member Author) commented:

IIRC that check will probably only look at the Package-level metadata, and not recurse down into any of the resources which make up the package (which confused the heck out of me when I was first working with the datapackage validations).

zschira (Member) commented Jul 17, 2024

This came up again in #242 and I tried to fix it with no luck. I've made the paths relative, but I've run into another error for all of the tables: Please provide "dialect.sql.table" for reading. I spent some time messing around with the dialect, but couldn't resolve the issue, so I'm going to leave this open for the time being.
