
Unable to run Actions on push with basic workflow file. #29338

Closed
CakePost opened this issue Feb 23, 2024 · 17 comments · Fixed by #29495

@CakePost

CakePost commented Feb 23, 2024

Description

I have the following configuration for my gitea server.

My runner is successfully identified and connected.
The logs for the docker deployment of my runner are as follows, indicating success.

level=info msg="Registering runner, arch=amd64, os=linux, version=v0.2.6."
level=error msg="Cannot ping the Gitea instance server" error="unavailable: 502 Bad Gateway"
level=error msg="Cannot ping the Gitea instance server" error="unavailable: 502 Bad Gateway"
level=error msg="Cannot ping the Gitea instance server" error="unavailable: 502 Bad Gateway"
level=debug msg="Successfully pinged the Gitea instance server"
level=info msg="Runner registered successfully."
SUCCESS
time="2024-02-23T00:11:17Z" level=info msg="Starting runner daemon"
time="2024-02-23T00:11:17Z" level=info msg="runner: 099[REDACTED], with version: v0.2.6, with labels: [ubuntu-latest ubuntu-22.04 ubuntu-20.04 ubuntu-18.04], declare successfully"

My workflow config file looks as follows:

name: Gitea Actions Demo
run-name: This is testing out Gitea Actions
on: push

jobs:
  Explore-Gitea-Actions:
    runs-on: ubuntu-22.04
    steps:
      - run: echo "The job was automatically triggered by a ${{ gitea.event_name }} event."

The log output of my gitea server shows this information when I push to the repo with the workflow:

2024/02/23 00:26:23 ...eb/routing/logger.go:102:func1() [I] router: completed POST /api/actions/runner.v1.RunnerService/FetchTask for [REDACTED IP], 200 OK in 33.1ms @ <autogenerated>:1(http.Handler.ServeHTTP-fm)
2024/02/23 00:26:25 ...eb/routing/logger.go:102:func1() [I] router: completed POST /api/actions/runner.v1.RunnerService/FetchTask for [REDACTED IP], 200 OK in 32.9ms @ <autogenerated>:1(http.Handler.ServeHTTP-fm)
2024/02/23 00:26:27 ...eb/routing/logger.go:102:func1() [I] router: completed GET /[REDACTED]website.git/info/refs?service=git-upload-pack for [REDACTED IP], 401 Unauthorized in 15.9ms @ repo/githttp.go:532(repo.GetInfoRefs)
2024/02/23 00:26:27 ...eb/routing/logger.go:102:func1() [I] router: completed GET /[REDACTED]website.git/info/refs?service=git-upload-pack for [REDACTED IP], 200 OK in 141.5ms @ repo/githttp.go:532(repo.GetInfoRefs)
2024/02/23 00:26:27 ...eb/routing/logger.go:102:func1() [I] router: completed POST /api/actions/runner.v1.RunnerService/FetchTask for [REDACTED IP], 200 OK in 18.5ms @ <autogenerated>:1(http.Handler.ServeHTTP-fm)
2024/02/23 00:26:28 ...eb/routing/logger.go:102:func1() [I] router: completed POST /[REDACTED]website.git/git-upload-pack for [REDACTED IP], 200 OK in 154.5ms @ repo/githttp.go:492(repo.ServiceUploadPack)
2024/02/23 00:26:29 ...eb/routing/logger.go:102:func1() [I] router: completed POST /api/actions/runner.v1.RunnerService/FetchTask for [REDACTED IP], 200 OK in 24.6ms @ <autogenerated>:1(http.Handler.ServeHTTP-fm)
2024/02/23 00:26:31 ...eb/routing/logger.go:102:func1() [I] router: completed POST /api/actions/runner.v1.RunnerService/FetchTask for [REDACTED IP], 200 OK in 33.8ms @ <autogenerated>:1(http.Handler.ServeHTTP-fm)
2024/02/23 00:26:33 ...eb/routing/logger.go:102:func1() [I] router: completed POST /api/actions/runner.v1.RunnerService/FetchTask for [REDACTED IP], 200 OK in 32.6ms @ <autogenerated>:1(http.Handler.ServeHTTP-fm)
2024/02/23 00:26:34 ...eb/routing/logger.go:102:func1() [I] router: completed GET /user/events for [REDACTED IP], 200 OK in 433101.7ms @ events/events.go:18(events.Events)
2024/02/23 00:26:34 ...eb/routing/logger.go:102:func1() [I] router: completed GET /admin for [REDACTED IP], 200 OK in 43.4ms @ admin/admin.go:127(admin.Dashboard)
2024/02/23 00:26:34 ...eb/routing/logger.go:102:func1() [I] router: completed GET /avatar/c43863c87e1d91294c0af8ce37b3d4fa?size=48 for [REDACTED IP], 303 See Other in 124.7ms @ user/avatar.go:48(user.AvatarByEmailHash)
2024/02/23 00:26:35 ...eb/routing/logger.go:102:func1() [I] router: completed POST /api/actions/runner.v1.RunnerService/FetchTask for [REDACTED IP], 200 OK in 31.8ms @ <autogenerated>:1(http.Handler.ServeHTTP-fm)
2024/02/23 00:26:37 ...eb/routing/logger.go:102:func1() [I] router: completed POST /api/actions/runner.v1.RunnerService/FetchTask for [REDACTED IP], 200 OK in 28.1ms @ <autogenerated>:1(http.Handler.ServeHTTP-fm)
2024/02/23 00:26:38 ...eb/routing/logger.go:68:func1() [I] router: polling   GET /user/events for [REDACTED IP], elapsed 3763.7ms @ events/events.go:18(events.Events)
2024/02/23 00:26:39 ...eb/routing/logger.go:102:func1() [I] router: completed POST /api/actions/runner.v1.RunnerService/FetchTask for [REDACTED IP], 200 OK in 30.6ms @ <autogenerated>:1(http.Handler.ServeHTTP-fm)
2024/02/23 00:26:41 ...eb/routing/logger.go:102:func1() [I] router: completed POST /api/actions/runner.v1.RunnerService/FetchTask for [REDACTED IP], 200 OK in 32.7ms @ <autogenerated>:1(http.Handler.ServeHTTP-fm)
2024/02/23 00:26:43 ...eb/routing/logger.go:102:func1() [I] router: completed POST /api/actions/runner.v1.RunnerService/FetchTask for [REDACTED IP], 200 OK in 29.2ms @ <autogenerated>:1(http.Handler.ServeHTTP-fm)

Gitea appears to be picking up the file just fine, as the workflow shows up in the UI for the repo.

Unfortunately, this workflow is not being triggered when I push to the repo.

I am able to get the workflow to run on try.gitea.io, as demonstrated here: https://try.gitea.io/ActionsDebugTest/ActionsDebugTest . It uses the same workflow file as on my server. One notable difference is that the repo on my server is nested under an organization, and I don't think I'm able to create an organization on try.gitea.io. Another notable difference is that the repo on my server has limited visibility; making the repo completely private still allows the workflow to run.

This looks a lot like the resolved issue #28277, but the resolution listed there doesn't make sense even for that issue.

@lunny wrote "This is because your labels are not matched.", yet that issue clearly shows a runner with a matching label for the workflow. Am I missing something?

Gitea Version

1.21.6

Can you reproduce the bug on the Gitea demo site?

No

Log Gist

No response

Screenshots

No response

Git Version

2.30.2

Operating System

Debian 10.2.1-6

How are you running Gitea?

I've deployed the Actions runner using the docker-compose example described in the documentation. I'm including the compose content for the rest of the Gitea instance as well.

version: "3.9"
services:
  gitea:
    image: gitea/gitea:latest
    restart: always
    hostname: [REDACTED IP]
    environment:
      - USER=git
      - USER_UID=1000
      - USER_GID=998
      - GITEA__database__DB_TYPE=postgres
      - GITEA__database__HOST=giteadb:5432
      - GITEA__database__NAME=[REDACTED]
      - GITEA__database__USER=[REDACTED]
      - GITEA__database__PASSWD=[REDACTED]
    networks:
      - gitea
    ports:
      - 3000:3000
      - 22:22
    volumes:
      - /[REDACTED]/gitea/data:/data
      - /etc/timezone:/etc/timezone:ro
      - /etc/localtime:/etc/localtime:ro
    shm_size: 256m
    depends_on:
      - giteadb

  giteadb:
    image: postgres:14
    restart: always
    environment:
      - POSTGRES_USER=[REDACTED]
      - POSTGRES_PASSWORD=[REDACTED]
      - POSTGRES_DB=[REDACTED]
    networks:
      - gitea
    volumes:
      - /[REDACTED]/gitea/db:/var/lib/postgresql/data

  runner:
    image: gitea/act_runner:latest
    restart: always
    depends_on:
      - gitea
    volumes:
      - /[REDACTED]/data/act_runner:/data
      - /var/run/docker.sock:/var/run/docker.sock
    environment:
      - GITEA_INSTANCE_URL=https://[REDACTED].com
      - GITEA_RUNNER_REGISTRATION_TOKEN=[REDACTED]
networks:
  gitea:
    external: false
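
For anyone reproducing this setup, a minimal sketch of bringing the stack up and confirming runner registration, using the service names from the compose file above (the assumption that the gitea binary is on the container's PATH is mine):

# Start the stack and watch the runner register; its log should end with
# "Runner registered successfully." as shown earlier in this issue.
docker compose up -d
docker compose logs -f runner
# Optional sanity check that the server container is healthy:
docker compose exec -u git gitea gitea --version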

Database

PostgreSQL

@CakePost
Author

I investigated whether I could get something working with act_runner directly instead of using Docker, but I'm still unable to get a job to actually run on push. I think one culprit might be my nginx configuration, so I'll look into that to reduce some variables.

@CakePost
Author

(screenshot of nginx access log entries)

I've looked through my nginx access and error logs and have not been able to find any smoking guns or indications that it shouldn't be working. These connections show the routing for the two runners I have set up: one running locally on my desktop using act_runner directly, and the other deployed using the Docker image as described above.

@CakePost
Author

I would like to continue investigating, but I've run out of ideas for places to look:

  • Gitea logs look fine
  • Runner logs look fine
  • Nginx logs look fine and corroborate the gitea logs
  • Docker logs look fine

Maybe I'm missing a setting to expose more verbose logs that might show the issue? I've looked at the server's firewall to verify that it wasn't blocking anything; it's pretty permissive, with the relevant ports entirely open:

  • 22
  • 80
  • 443

@yp05327
Contributor

yp05327 commented Feb 27, 2024

IIRC, we had a similar issue before: if the server has no public internet access, workflows will not run.

Edited:
I see you already mentioned it, #28277.
In that issue, things finally started working after the reporter moved Gitea (or the runner? I don't remember clearly) to another server with public internet access.
But it is hard to say that this was the real reason.
See: #28277 (comment)

@CakePost
Author

@yp05327 What exactly qualifies my server as offline, though? Is it that it's deployed in a Docker container? The reverse proxy? If connectivity is the issue, that suggests there may be some undocumented ports that need opening?

When I have some time I'll try to put together a minimal working example of this failure, perhaps locally.

@yp05327
Contributor

yp05327 commented Feb 27, 2024

What I wanted to say is that the root cause of #28277 is not "This is because your labels are not matched."
It seems that the offline server caused it, but that reason is strange, as workflows should be detected whether the server is online or offline, so I don't think it is the root cause, and #28277 should stay open.
If your server is online, then maybe this is caused by something else, or by the same unknown reason as in #28277.
I will try to contact @zhangbaojia and look into this issue.

@zhangbaojia

I haven't followed up on this issue since, but I've continued to use gitea. The previous version of gitea was 1.19.3, and now I am installing 1.20.6. I'll try again this week.

@zhangbaojia

> I haven't followed up on this issue since, but I've continued to use gitea. The previous version of gitea was 1.19.3, and now I am installing 1.20.6. I'll try again this week.

I still have reservations about servers without WAN

@wolfogre
Member

wolfogre commented Feb 28, 2024

I'm sure this has nothing to do with the runner, since there are no runs listed on the page at all.

Unfortunately, I cannot reproduce it in my environment.

I would appreciate it if someone could provide trace logs. Please:

  1. Update app.ini to enable trace logging:
     [log]
     LEVEL = trace
  2. Restart Gitea.
  3. Push some commits to the repository that should trigger workflows but don't; push a few more times to generate more logs.
  4. Finally, paste the logs into this issue. If you trust me, you can send the entire log file to my email, and I'll check it line by line.
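
For anyone following along, a minimal sketch of that debugging loop, assuming the docker-compose deployment shown earlier (service name gitea) and a local clone of the affected repository:

# Restart Gitea so the trace level takes effect, push, then collect the logs.
docker compose restart gitea
# Push a few empty commits to the repository that should trigger the workflow:
git commit --allow-empty -m "trigger actions" && git push
git commit --allow-empty -m "trigger actions again" && git push
# Collect the logs; where they end up depends on the [log] MODE setting
# (console output is captured by Docker, file output lands under the data volume):
docker compose logs gitea > gitea-trace.log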

@wolfogre added the topic/gitea-actions label Feb 28, 2024
@CakePost
Author

Thanks for looking into this, @wolfogre. I've sent an email with un-redacted trace logs as requested.

@soul-walker

I've encountered the same issue. After multiple attempts, I believe the problem here is that the SHA isn't updated by the push and there are no corresponding push records; it's as if the push didn't trigger the push event, though I don't understand why. Interestingly, I couldn't replicate the issue when testing the service locally in the same way as it's done online.

  • New push: (screenshot)
  • The SHA is not updated: (screenshot)
  • The main page shows no push event: (screenshot)

@wolfogre
Member

According to the trace log provided by @CakePost, I believe the problem is that the Git hooks are broken. That's why actions, the code indexer, and activities cannot work.

It could be fixed by regenerating the hooks, but before that, I would appreciate it if you could zip the hooks directory of the repository and attach it to this issue. Don't worry, all the files in it are just hook scripts generated by Gitea. This will help figure out what happened.

The location is [GIT_DATE_REPO_DIR]/[owner]/[repo].git/hooks, for example:

  • For @CakePost, it's [GIT_DATE_REPO_DIR]/actionsdebugtest/actionsdebugtest.git/hooks
  • For @soul-walker, it's [GIT_DATE_REPO_DIR]/ceshi1/bj-rtsts-server-go.git/hooks

Then you can fix the hooks by running gitea admin regenerate hooks. If all is well, things should go back to normal. Once the hooks have been fixed, it will be difficult to check what happened, so please consider providing a dump by following the steps above first.
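
A hedged sketch of collecting that dump and then regenerating the hooks, assuming the docker-compose deployment above (repositories under /data/git/repositories inside the gitea container, gitea binary on the container's PATH; the placeholder paths are mine):

# Archive the hooks directory of the affected repository before changing anything:
docker compose exec -u git gitea tar czf /tmp/hooks-dump.tar.gz -C /data/git/repositories/<owner>/<repo>.git hooks
docker compose cp gitea:/tmp/hooks-dump.tar.gz .
# Then regenerate all hooks as suggested:
docker compose exec -u git gitea gitea admin regenerate hooks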

@wxiaoguang
Contributor

The FAQ could be updated

https://docs.gitea.com/help/faq#push-hook--webhook-arent-running

Add "Actions don't run"

@yp05327
Contributor

yp05327 commented Feb 29, 2024

> I haven't followed up on this issue since, but I've continued to use gitea. The previous version of gitea was 1.19.3, and now I am installing 1.20.6. I'll try again this week.
>
> I still have reservations about servers without WAN

@zhangbaojia
Can you check whether you have the same problem mentioned above?
If not, you can also send trace logs to @wolfogre.

@soul-walker

> (Quoting @wolfogre's instructions above about archiving the hooks directory and running gitea admin regenerate hooks.)

Thank you. Here are the hook files you mentioned. I'll try the steps you suggested.
bj-rtsts-server-go_hooks.tar.gz

@CakePost
Author

> The FAQ could be updated
>
> https://docs.gitea.com/help/faq#push-hook--webhook-arent-running
>
> Add "Actions don't run"

(screenshot of the relevant FAQ item)

This might be the reason, as this particular item is relevant to my setup. Man, trying to host any amount of working data on a CIFS mount may have been a huge mistake; I would never have expected the lack of chmod and permission-bit support to be so problematic.

As a test, I changed my /etc/fstab entry for that mount to use file permissions of 700 instead of 600 and, lo and behold, the actions have begun running.
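For reference, a hedged illustration of the kind of /etc/fstab change involved. The share name, mount point, and credentials file are hypothetical; file_mode/dir_mode are standard mount.cifs options, the uid/gid mirror the USER_UID/USER_GID from the compose file above, and 0700 keeps the owner's execute bit that 0600 drops:

# Hypothetical fstab entry for the CIFS share backing the Gitea data volume:
//nas.example.com/gitea  /srv/gitea  cifs  credentials=/etc/gitea-cifs.cred,uid=1000,gid=998,file_mode=0700,dir_mode=0700  0  0

# Remount, then confirm a generated hook script is actually executable
# (the path under the mount depends on your volume layout):
umount /srv/gitea && mount /srv/gitea
ls -l /srv/gitea/git/repositories/<owner>/<repo>.git/hooks/pre-receive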

Incidentally, @wolfogre, I did not need to run gitea admin regenerate hooks for it to start working.

As for recommendations, I agree with @wxiaoguang that adding Actions to the FAQ page they linked could save some debugging effort for folks whose setup shares details like mine.

I also wish the logs included some messaging when the lack of chmod support starts breaking features.

At any rate, I've found the reason my Actions weren't running. It still seems possible that @soul-walker's Actions aren't running for a different reason, so do follow up with them. I'll of course remain available if there's any other information you'd like from me. Thank you all for your help with this issue; I really appreciate all that you do for Gitea ^.^

@soul-walker

> The FAQ could be updated
>
> https://docs.gitea.com/help/faq#push-hook--webhook-arent-running
>
> Add "Actions don't run"

Based on this FAQ, I found that my issue was caused by the version of the Docker service. After upgrading it, everything works fine now. Thank you very much!

@wolfogre removed the topic/gitea-actions label Feb 29, 2024
GiteaBot pushed a commit to GiteaBot/gitea that referenced this issue Feb 29, 2024
lunny pushed a commit that referenced this issue Feb 29, 2024
silverwind pushed a commit that referenced this issue Mar 6, 2024
Detect broken git hooks by checking whether the commit ID of each branch in the DB matches the one in the git repo.

It can help with #29338, #28277, and maybe more issues.
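
A rough manual analogue of that check, with paths assumed from the Docker layout discussed earlier in this issue:

# Read the branch head straight from the bare repository on disk:
git -C /data/git/repositories/<owner>/<repo>.git rev-parse refs/heads/main
# Compare it with the commit SHA Gitea shows for that branch in the web UI or API.
# If the on-disk head is ahead of what Gitea records, the push hooks never ran.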

Users could complain about actions, webhooks, and activities not working without realizing that broken git hooks are the cause, unless they can see a warning.

(screenshot of the new warning: https://github.com/go-gitea/gitea/assets/9418365/2b92a46d-7f1d-4115-bef4-9f970bd695da)


It should be merged after #29493. Otherwise, users could see an ephemeral warning after committing and then immediately opening the repo home page.

And it also waits for #29495, since the doc link (the anchor part) will
be updated.

---------

Co-authored-by: wxiaoguang <[email protected]>
Co-authored-by: Giteabot <[email protected]>
github-actions bot locked this issue as resolved and limited conversation to collaborators Mar 11, 2024