Commit

Merge branch 'main' of github.com:juspay/hyperswitch into iatapay-through-hyperswitch-cypress

* 'main' of github.com:juspay/hyperswitch:
  feat(router): add support for googlepay step up flow (#2744)
  fix(access_token): use `merchant_connector_id` in access token (#5106)
  feat: added kafka events for authentication create and update (#4991)
  feat(ci): add vector to handle logs pipeline (#5021)
  feat(users): Decision manager flow changes for SSO (#4995)
  ci(cypress): Fix payment method id for non supported connectors (#5075)
  refactor(core): introduce an interface to switch between old and new connector integration implementations on the connectors (#5013)
  refactor(events): populate object identifiers in outgoing webhooks analytics events during retries (#5067)
  Refactor: [Fiserv] Remove Default Case Handling (#4767)
  chore(version): 2024.06.24.0
  fix(router): avoid considering pre-routing results during `perform_session_token_routing` (#5076)
  refactor(redis): spawn one subscriber thread for handling all the published messages to different channel (#5064)
  feat(users): setup user authentication methods schema and apis (#4999)
  feat(payment_methods): Implement Process tracker workflow for Payment method Status update (#4668)
  chore(version): 2024.06.20.1
  chore(postman): update Postman collection files
  fix(payment_methods): support last used for off session token payments (#5039)
  ci(postman): add net_amount field test cases (#3286)
  refactor(connector): [Mifinity]dynamic fields for mifinity (#5056)
  refactor(payment_method): [Klarna] store and populate payment_type for klarna_sdk Paylater in response (#4956)
pixincreate committed Jun 24, 2024
2 parents 3063b7b + ff84d78 commit a567381
Showing 215 changed files with 6,208 additions and 1,100 deletions.
44 changes: 44 additions & 0 deletions CHANGELOG.md
@@ -4,6 +4,50 @@ All notable changes to HyperSwitch will be documented here.

- - -

## 2024.06.24.0

### Features

- **payment_methods:** Implement Process tracker workflow for Payment method Status update ([#4668](https://github.com/juspay/hyperswitch/pull/4668)) ([`5cde7ee`](https://github.com/juspay/hyperswitch/commit/5cde7ee0344d4068a232c96f60b53629b8c17f7f))
- **users:** Setup user authentication methods schema and apis ([#4999](https://github.com/juspay/hyperswitch/pull/4999)) ([`2005d3d`](https://github.com/juspay/hyperswitch/commit/2005d3df9fc2e559ea65c57892ab940e38b9af50))

### Bug Fixes

- **router:** Avoid considering pre-routing results during `perform_session_token_routing` ([#5076](https://github.com/juspay/hyperswitch/pull/5076)) ([`a71fe03`](https://github.com/juspay/hyperswitch/commit/a71fe033e7de75171d140506ff4d51a362c185f4))

### Refactors

- **redis:** Spawn one subscriber thread for handling all the published messages to different channel ([#5064](https://github.com/juspay/hyperswitch/pull/5064)) ([`6a07e10`](https://github.com/juspay/hyperswitch/commit/6a07e10af379006c4643bb8f0a9cb2f46813ff8a))

**Full Changelog:** [`2024.06.20.1...2024.06.24.0`](https://github.com/juspay/hyperswitch/compare/2024.06.20.1...2024.06.24.0)

- - -

## 2024.06.20.1

### Features

- **cypress:** Add 2 more payout connectors and bank transfer support for payout ([#4993](https://github.com/juspay/hyperswitch/pull/4993)) ([`45a908b`](https://github.com/juspay/hyperswitch/commit/45a908b4407db160b5f92b0bf84a9612cfaf44ef))

### Bug Fixes

- **cypress:** Address cypress skipping tests ([#5046](https://github.com/juspay/hyperswitch/pull/5046)) ([`973ecbf`](https://github.com/juspay/hyperswitch/commit/973ecbf84ec62d05556ccd568992243e460f8b10))
- **payment_methods:** Support last used for off session token payments ([#5039](https://github.com/juspay/hyperswitch/pull/5039)) ([`d98293a`](https://github.com/juspay/hyperswitch/commit/d98293ae9a01ccdcd62466d9c64c6b6f492f227b))

### Refactors

- **connector:** [Mifinity]dynamic fields for mifinity ([#5056](https://github.com/juspay/hyperswitch/pull/5056)) ([`6f58b4e`](https://github.com/juspay/hyperswitch/commit/6f58b4efbd111b2d0b8ef33cbcd377c434f181e2))
- **payment_method:** [Klarna] store and populate payment_type for klarna_sdk Paylater in response ([#4956](https://github.com/juspay/hyperswitch/pull/4956)) ([`c9bfb89`](https://github.com/juspay/hyperswitch/commit/c9bfb89f7eb03d73d5af3fe2bcd7632347ec17b4))
- Introduce ConnectorIntegrationNew and add default implementation for each Connector ([#4989](https://github.com/juspay/hyperswitch/pull/4989)) ([`84bed81`](https://github.com/juspay/hyperswitch/commit/84bed81defce0671274241318204029a2bb30a12))

### Miscellaneous Tasks

- **postman:** Update Postman collection files ([`d546415`](https://github.com/juspay/hyperswitch/commit/d546415c26c2727c22ad0e46dba1c0235501cd2e))

**Full Changelog:** [`2024.06.20.0...2024.06.20.1`](https://github.com/juspay/hyperswitch/compare/2024.06.20.0...2024.06.20.1)

- - -

## 2024.06.20.0

### Features
30 changes: 30 additions & 0 deletions api-reference/openapi_spec.json
@@ -9684,6 +9684,23 @@
"GoPayRedirection": {
"type": "object"
},
"GooglePayAssuranceDetails": {
"type": "object",
"required": [
"card_holder_authenticated",
"account_verified"
],
"properties": {
"card_holder_authenticated": {
"type": "boolean",
"description": "indicates that Cardholder possession validation has been performed"
},
"account_verified": {
"type": "boolean",
"description": "indicates that identification and verifications (ID&V) was performed"
}
}
},
"GooglePayPaymentMethodInfo": {
"type": "object",
"required": [
@@ -9698,6 +9715,14 @@
"card_details": {
"type": "string",
"description": "The details of the card"
},
"assurance_details": {
"allOf": [
{
"$ref": "#/components/schemas/GooglePayAssuranceDetails"
}
],
"nullable": true
}
}
},
@@ -9845,6 +9870,11 @@
}
],
"nullable": true
},
"assurance_details_required": {
"type": "boolean",
"description": "Whether assurance details are required",
"nullable": true
}
}
},
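The new `GooglePayAssuranceDetails` schema adds two required boolean fields, while the object itself is nullable on `GooglePayPaymentMethodInfo`. A minimal sketch of validating a decoded payload against these fields — the sample payload below is hypothetical and shaped only for illustration, not taken from an actual wallet response:

```python
def validate_assurance_details(info):
    """Return the assurance details if present and well-formed, else None."""
    details = info.get("assurance_details")
    if details is None:  # "nullable": true in the schema
        return None
    # Both fields are required booleans per the schema
    for field in ("card_holder_authenticated", "account_verified"):
        if not isinstance(details.get(field), bool):
            raise ValueError(f"missing or non-boolean field: {field}")
    return details

# Hypothetical payload shaped like GooglePayPaymentMethodInfo
sample = {
    "card_network": "VISA",
    "card_details": "1234",
    "assurance_details": {
        "card_holder_authenticated": True,
        "account_verified": True,
    },
}
checked = validate_assurance_details(sample)
```

Since `assurance_details` is nullable, consumers should treat its absence as "no assurance information available" rather than as a failure.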
28 changes: 16 additions & 12 deletions config/config.example.toml
@@ -587,17 +587,18 @@ enabled = true # Switch to enable or disable PayPal onboard
source = "logs" # The event sink to push events supports kafka or logs (stdout)

[events.kafka]
brokers = [] # Kafka broker urls for bootstrapping the client
intent_analytics_topic = "topic" # Kafka topic to be used for PaymentIntent events
attempt_analytics_topic = "topic" # Kafka topic to be used for PaymentAttempt events
refund_analytics_topic = "topic" # Kafka topic to be used for Refund events
api_logs_topic = "topic" # Kafka topic to be used for incoming api events
connector_logs_topic = "topic" # Kafka topic to be used for connector api events
outgoing_webhook_logs_topic = "topic" # Kafka topic to be used for outgoing webhook events
dispute_analytics_topic = "topic" # Kafka topic to be used for Dispute events
audit_events_topic = "topic" # Kafka topic to be used for Payment Audit events
payout_analytics_topic = "topic" # Kafka topic to be used for Payouts and PayoutAttempt events
consolidated_events_topic = "topic" # Kafka topic to be used for Consolidated events
brokers = [] # Kafka broker urls for bootstrapping the client
intent_analytics_topic = "topic" # Kafka topic to be used for PaymentIntent events
attempt_analytics_topic = "topic" # Kafka topic to be used for PaymentAttempt events
refund_analytics_topic = "topic" # Kafka topic to be used for Refund events
api_logs_topic = "topic" # Kafka topic to be used for incoming api events
connector_logs_topic = "topic" # Kafka topic to be used for connector api events
outgoing_webhook_logs_topic = "topic" # Kafka topic to be used for outgoing webhook events
dispute_analytics_topic = "topic" # Kafka topic to be used for Dispute events
audit_events_topic = "topic" # Kafka topic to be used for Payment Audit events
payout_analytics_topic = "topic" # Kafka topic to be used for Payouts and PayoutAttempt events
consolidated_events_topic = "topic" # Kafka topic to be used for Consolidated events
authentication_analytics_topic = "topic" # Kafka topic to be used for Authentication events

# File storage configuration
[file_storage]
@@ -644,4 +645,7 @@ enabled = false
global_tenant = { schema = "public", redis_key_prefix = "" }

[multitenancy.tenants]
public = { name = "hyperswitch", base_url = "http:https://localhost:8080", schema = "public", redis_key_prefix = "", clickhouse_database = "default"} # schema -> Postgres db schema, redis_key_prefix -> redis key distinguisher, base_url -> url of the tenant
public = { name = "hyperswitch", base_url = "http:https://localhost:8080", schema = "public", redis_key_prefix = "", clickhouse_database = "default"} # schema -> Postgres db schema, redis_key_prefix -> redis key distinguisher, base_url -> url of the tenant

[user_auth_methods]
encryption_key = "" # Encryption key used for encrypting data in user_authentication_methods table
2 changes: 1 addition & 1 deletion config/dashboard.toml
@@ -31,7 +31,7 @@ surcharge=false
dispute_evidence_upload=false
paypal_automatic_flow=false
threeds_authenticator=false
global_search=false
global_search=true
dispute_analytics=true
configure_pmts=false
branding=false
28 changes: 16 additions & 12 deletions config/deployments/env_specific.toml
@@ -71,17 +71,18 @@ sts_role_session_name = "" # An identifier for the assumed role session, used to
source = "logs" # The event sink to push events supports kafka or logs (stdout)

[events.kafka]
brokers = [] # Kafka broker urls for bootstrapping the client
intent_analytics_topic = "topic" # Kafka topic to be used for PaymentIntent events
attempt_analytics_topic = "topic" # Kafka topic to be used for PaymentAttempt events
refund_analytics_topic = "topic" # Kafka topic to be used for Refund events
api_logs_topic = "topic" # Kafka topic to be used for incoming api events
connector_logs_topic = "topic" # Kafka topic to be used for connector api events
outgoing_webhook_logs_topic = "topic" # Kafka topic to be used for outgoing webhook events
dispute_analytics_topic = "topic" # Kafka topic to be used for Dispute events
audit_events_topic = "topic" # Kafka topic to be used for Payment Audit events
payout_analytics_topic = "topic" # Kafka topic to be used for Payouts and PayoutAttempt events
consolidated_events_topic = "topic" # Kafka topic to be used for Consolidated events
brokers = [] # Kafka broker urls for bootstrapping the client
intent_analytics_topic = "topic" # Kafka topic to be used for PaymentIntent events
attempt_analytics_topic = "topic" # Kafka topic to be used for PaymentAttempt events
refund_analytics_topic = "topic" # Kafka topic to be used for Refund events
api_logs_topic = "topic" # Kafka topic to be used for incoming api events
connector_logs_topic = "topic" # Kafka topic to be used for connector api events
outgoing_webhook_logs_topic = "topic" # Kafka topic to be used for outgoing webhook events
dispute_analytics_topic = "topic" # Kafka topic to be used for Dispute events
audit_events_topic = "topic" # Kafka topic to be used for Payment Audit events
payout_analytics_topic = "topic" # Kafka topic to be used for Payouts and PayoutAttempt events
consolidated_events_topic = "topic" # Kafka topic to be used for Consolidated events
authentication_analytics_topic = "topic" # Kafka topic to be used for Authentication events

# File storage configuration
[file_storage]
@@ -259,4 +260,7 @@ enabled = false
global_tenant = { schema = "public", redis_key_prefix = "" }

[multitenancy.tenants]
public = { name = "hyperswitch", base_url = "http:https://localhost:8080", schema = "public", redis_key_prefix = "", clickhouse_database = "default"}
public = { name = "hyperswitch", base_url = "http:https://localhost:8080", schema = "public", redis_key_prefix = "", clickhouse_database = "default"}

[user_auth_methods]
encryption_key = "user_auth_table_encryption_key" # Encryption key used for encrypting data in user_authentication_methods table
4 changes: 4 additions & 0 deletions config/development.toml
@@ -598,6 +598,7 @@ dispute_analytics_topic = "hyperswitch-dispute-events"
audit_events_topic = "hyperswitch-audit-events"
payout_analytics_topic = "hyperswitch-payout-events"
consolidated_events_topic = "hyperswitch-consolidated-events"
authentication_analytics_topic = "hyperswitch-authentication-events"

[analytics]
source = "sqlx"
@@ -654,3 +655,6 @@ global_tenant = { schema = "public", redis_key_prefix = "" }

[multitenancy.tenants]
public = { name = "hyperswitch", base_url = "http:https://localhost:8080", schema = "public", redis_key_prefix = "", clickhouse_database = "default"}

[user_auth_methods]
encryption_key = "A8EF32E029BC3342E54BF2E172A4D7AA43E8EF9D2C3A624A9F04E2EF79DC698F"
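The development `encryption_key` above is a 64-character hex string, i.e. 32 bytes once decoded — the size of a 256-bit symmetric key. A minimal sanity check, assuming hex encoding (the diff does not state which cipher consumes this key, so the expected length is an assumption based on the sample value):

```python
def check_encryption_key(key_hex, expected_bytes=32):
    """Decode a hex-encoded key and verify its byte length."""
    key = bytes.fromhex(key_hex)
    if len(key) != expected_bytes:
        raise ValueError(
            f"expected {expected_bytes}-byte key, got {len(key)} bytes"
        )
    return key

# Sample key from config/development.toml (a dev value, not a production secret)
sample_key = "A8EF32E029BC3342E54BF2E172A4D7AA43E8EF9D2C3A624A9F04E2EF79DC698F"
decoded = check_encryption_key(sample_key)
```

A check like this at startup fails fast on a mis-pasted or truncated key instead of producing decryption errors later.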
6 changes: 5 additions & 1 deletion config/docker_compose.toml
@@ -442,6 +442,7 @@ dispute_analytics_topic = "hyperswitch-dispute-events"
audit_events_topic = "hyperswitch-audit-events"
payout_analytics_topic = "hyperswitch-payout-events"
consolidated_events_topic = "hyperswitch-consolidated-events"
authentication_analytics_topic = "hyperswitch-authentication-events"

[analytics]
source = "sqlx"
@@ -507,4 +508,7 @@ enabled = false
global_tenant = { schema = "public", redis_key_prefix = "" }

[multitenancy.tenants]
public = { name = "hyperswitch", base_url = "http:https://localhost:8080", schema = "public", redis_key_prefix = "", clickhouse_database = "default"}
public = { name = "hyperswitch", base_url = "http:https://localhost:8080", schema = "public", redis_key_prefix = "", clickhouse_database = "default"}

[user_auth_methods]
encryption_key = "A8EF32E029BC3342E54BF2E172A4D7AA43E8EF9D2C3A624A9F04E2EF79DC698F"
8 changes: 8 additions & 0 deletions config/prometheus.yaml
@@ -35,3 +35,11 @@ scrape_configs:

static_configs:
- targets: ["otel-collector:8888"]

- job_name: "vector"

# metrics_path defaults to '/metrics'
# scheme defaults to 'http'.

static_configs:
- targets: ["vector:9598"]
134 changes: 134 additions & 0 deletions config/vector.yaml
@@ -0,0 +1,134 @@
acknowledgements:
enabled: true

api:
enabled: true
address: 0.0.0.0:8686

sources:
kafka_tx_events:
type: kafka
bootstrap_servers: kafka0:29092
group_id: sessionizer
topics:
- hyperswitch-payment-attempt-events
- hyperswitch-payment-intent-events
- hyperswitch-refund-events
- hyperswitch-dispute-events
decoding:
codec: json

app_logs:
type: docker_logs
include_labels:
- "logs=promtail"

vector_metrics:
type: internal_metrics

node_metrics:
type: host_metrics

transforms:
plus_1_events:
type: filter
inputs:
- kafka_tx_events
condition: ".sign_flag == 1"

hs_server_logs:
type: filter
inputs:
- app_logs
condition: '.labels."com.docker.compose.service" == "hyperswitch-server"'

parsed_hs_server_logs:
type: remap
inputs:
- app_logs
source: |-
.message = parse_json!(.message)
events:
type: remap
inputs:
- plus_1_events
source: |-
.timestamp = from_unix_timestamp!(.created_at, unit: "seconds")
sinks:
opensearch_events:
type: elasticsearch
inputs:
- events
endpoints:
- "https://opensearch:9200"
id_key: message_key
api_version: v7
tls:
verify_certificate: false
verify_hostname: false
auth:
strategy: basic
user: admin
password: 0penS3arc#
encoding:
except_fields:
- message_key
- offset
- partition
- topic
bulk:
# Add a date prefixed index for better grouping
# index: "vector-{{ .topic }}-%Y-%m-%d"
index: "{{ .topic }}"

opensearch_logs:
type: elasticsearch
inputs:
- parsed_hs_server_logs
endpoints:
- "https://opensearch:9200"
api_version: v7
tls:
verify_certificate: false
verify_hostname: false
auth:
strategy: basic
user: admin
password: 0penS3arc#
bulk:
# Add a date prefixed index for better grouping
# index: "vector-{{ .topic }}-%Y-%m-%d"
index: "logs-{{ .container_name }}-%Y-%m-%d"

log_events:
type: loki
inputs:
- kafka_tx_events
endpoint: http:https://loki:3100
labels:
source: vector
topic: "{{ .topic }}"
job: kafka
encoding:
codec: json

log_app_loki:
type: loki
inputs:
- parsed_hs_server_logs
endpoint: http:https://loki:3100
labels:
source: vector
job: app_logs
container: "{{ .container_name }}"
stream: "{{ .stream }}"
encoding:
codec: json

metrics:
type: prometheus_exporter
inputs:
- vector_metrics
- node_metrics
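In the Vector pipeline above, the `plus_1_events` transform keeps only rows with `.sign_flag == 1` (the convention used with collapsing-style analytics tables, where a retraction row carries `sign_flag = -1`), and the `events` remap converts the epoch-seconds `created_at` into a proper timestamp. A small sketch emulating those two transforms on hypothetical event dicts — field names follow the config, but the event shape itself is assumed:

```python
from datetime import datetime, timezone

def transform(events):
    """Emulate the `plus_1_events` filter followed by the `events` remap."""
    out = []
    for event in events:
        if event.get("sign_flag") != 1:  # drop retraction rows
            continue
        event = dict(event)  # avoid mutating the caller's dict
        # from_unix_timestamp!(.created_at, unit: "seconds") in VRL
        event["timestamp"] = datetime.fromtimestamp(
            event["created_at"], tz=timezone.utc
        )
        out.append(event)
    return out

# Hypothetical payment-attempt events: an insert (+1) and its retraction (-1)
sample = [
    {"payment_id": "pay_1", "sign_flag": 1, "created_at": 1719187200},
    {"payment_id": "pay_1", "sign_flag": -1, "created_at": 1719187200},
]
kept = transform(sample)
```

Filtering to `sign_flag == 1` before indexing into OpenSearch means the sink only sees the "current" version of each event, while the Loki sink (`log_events`) intentionally receives the unfiltered stream.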
