淮新 2024-04-10 14:32:49 +08:00
parent f69d28a1c9
commit e7771e8f08
237 changed files with 20768 additions and 1 deletions

8
.gitignore vendored Normal file
View File

@ -0,0 +1,8 @@
#logs/
.git
__pycache__
*.pyc
*.pyo
build
dist
.vscode

201
API.md Normal file
View File

@ -0,0 +1,201 @@
## Environment variables
```python
import os

# Whether to delete the sync directory after a sync task finishes
DELETE_SYNC_DIR = os.getenv('DELETE_SYNC_DIR', False)
# Whether to log detailed information when a git command fails
LOG_DETAIL = os.getenv('LOG_DETAIL', True)
# Sync directory
SYNC_DIR = os.getenv("SYNC_DIR", "/tmp/sync_dir/")
```
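Note that `os.getenv` returns strings, so a value such as `DELETE_SYNC_DIR=false` would be truthy if used directly. A minimal sketch of how such a flag could be normalized (`env_flag` is an illustrative helper, not part of the project):
```python
import os

def env_flag(name: str, default: bool = False) -> bool:
    """Interpret common truthy strings from the environment (illustrative helper)."""
    raw = os.getenv(name)
    if raw is None:
        return default
    return raw.strip().lower() in ("1", "true", "yes", "on")

DELETE_SYNC_DIR = env_flag("DELETE_SYNC_DIR", False)  # e.g. DELETE_SYNC_DIR=true
```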
## Repository binding
Bind repository information through this endpoint.
- **URL**: `/cerobot/sync/repo`
- **Method**: `POST`
### Request body parameters
| Parameter | Type | Example | Required | Description |
| --- | --- | --- | --- | --- |
| repo_name | string | | yes | repository name |
| enable | bool | true/false | yes | whether sync is enabled |
| internal_repo_address | string | | yes | internal repository address |
| external_repo_address | string | | yes | external repository address |
| sync_granularity | enum('all', 'one') | 1 = repository-level sync<br />2 = branch-level sync | yes | sync granularity |
| sync_direction | enum('to_outer', 'to_inter') | 1 = internal to external<br />2 = external to internal | yes | sync direction |
### Request example
```json
{
"enable": true,
"repo_name": "ob-robot-test",
"internal_repo_address": "",
"external_repo_address": "",
"sync_granularity": 2,
"sync_direction": 1
}
```
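For reference, a minimal client sketch using `requests` (already listed in requirement.txt); the base URL is an assumption for a local deployment:
```python
import requests

BASE_URL = "http://127.0.0.1:8000"  # assumed local deployment

payload = {
    "enable": True,
    "repo_name": "ob-robot-test",
    "internal_repo_address": "",
    "external_repo_address": "",
    "sync_granularity": 2,
    "sync_direction": 1,
}
resp = requests.post(f"{BASE_URL}/cerobot/sync/repo", json=payload)
print(resp.json())  # expected shape: {"code_status": ..., "data": ..., "msg": ...}
```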
## Branch binding
Bind branches on a given repository through this endpoint.
- **URL**: `/cerobot/sync/{repo_name}/branch`
- **Method**: `POST`
### Request body parameters
| Parameter | Type | Example | Required | Description |
| --- | --- | --- | --- | --- |
| repo_name | string | | yes | repository name (path parameter) |
| enable | bool | true/false | yes | whether sync is enabled |
| internal_branch_name | string | | yes | internal branch name |
| external_branch_name | string | | yes | external branch name |
### Request example
With `repo_name` (e.g. `ob-robot-test`) supplied in the URL path:
```json
{
    "enable": true,
    "internal_branch_name": "test",
    "external_branch_name": "test"
}
```
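The same call as a `requests` sketch, with the repository name in the URL path (base URL assumed):
```python
import requests

BASE_URL = "http://127.0.0.1:8000"  # assumed local deployment
repo_name = "ob-robot-test"         # path parameter

payload = {
    "enable": True,
    "internal_branch_name": "test",
    "external_branch_name": "test",
}
resp = requests.post(f"{BASE_URL}/cerobot/sync/{repo_name}/branch", json=payload)
print(resp.json())
```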
## Repository-level sync
Trigger synchronization of a single repository through this endpoint.
- **URL**: `/cerobot/sync/repo/{repo_name}`
- **Method**: `POST`
### Request body parameters
| Parameter | Type | Example | Required | Description |
| --- | --- | --- | --- | --- |
| repo_name | string | | yes | repository name |
### Success response
**Condition**: the sync ran successfully.<br />**Status code:** `0` (operation succeeded)<br />**Response example**
```json
{
"code_status": 0,
"data": null,
"msg": "操作成功"
}
```
### Error response
**Condition**: the sync did not succeed.<br />**Status code:** `2xxxx` (git error)<br />**Response example**
```json
{
"code_status": 20009,
"data": null,
"msg": "分支不存在"
}
```
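A minimal sketch of triggering a repository-level sync and checking the response codes documented above (base URL assumed):
```python
import requests

BASE_URL = "http://127.0.0.1:8000"  # assumed local deployment

resp = requests.post(f"{BASE_URL}/cerobot/sync/repo/ob-robot-test")
body = resp.json()
if body["code_status"] != 0:
    # 2xxxx codes indicate a git error, e.g. 20009 "branch does not exist"
    print("sync failed:", body["code_status"], body["msg"])
```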
## Branch-level sync
Trigger synchronization of a single branch through this endpoint.
- **URL**: `/cerobot/sync/{repo_name}/branch/{branch_name}`
- **Method**: `POST`
### Request body parameters
| Parameter | Type | Example | Required | Description |
|-------------| --- | --- | --- |-------|
| repo_name | string | | yes | repository name |
| sync_direct | int | 1/2 | yes | sync direction:<br />1 = internal to external<br />2 = external to internal |
| branch_name | string | | yes | branch name |
Note: when syncing from internal to external, pass the internal branch name; when syncing from external to internal, pass the external branch name.
### Success response
**Condition**: the sync ran successfully.<br />**Status code:** `0` (operation succeeded)<br />**Response example**
```json
{
"code_status": 0,
"data": null,
"msg": "操作成功"
}
```
### Error response
**Condition**: the sync did not succeed.<br />**Status code:** `2xxxx` (git error)<br />**Response example**
```json
{
"code_status": 20009,
"data": null,
"msg": "分支不存在"
}
```
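A minimal sketch for the branch-level call; it assumes `sync_direct` travels in the JSON body, as the parameter table suggests (base URL assumed):
```python
import requests

BASE_URL = "http://127.0.0.1:8000"  # assumed local deployment

resp = requests.post(
    f"{BASE_URL}/cerobot/sync/ob-robot-test/branch/test",
    json={"sync_direct": 1},  # 1 = internal to external
)
body = resp.json()
print(body["code_status"], body["msg"])
```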
## Get repository information
Fetch repository information, paginated, through this endpoint.
- **URL**: `/cerobot/sync/repo`
- **Method**: `GET`
### Request parameters
| Parameter | Type | Example | Required | Description |
| --- | --- | --- | --- | --- |
| page_num | int | | no | page number |
| page_size | int | | no | page size |
| create_sort | bool | | no | sort by creation time; defaults to descending |
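A paginated listing sketch; passing the parameters as a query string is an assumption (base URL assumed):
```python
import requests

BASE_URL = "http://127.0.0.1:8000"  # assumed local deployment

resp = requests.get(
    f"{BASE_URL}/cerobot/sync/repo",
    params={"page_num": 1, "page_size": 10},
)
print(resp.json())
```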
## Get branch information
Fetch branch information for a repository, paginated, through this endpoint.
- **URL**: `/cerobot/sync/{repo_name}/branch`
- **Method**: `GET`
### Request parameters
| Parameter | Type | Example | Required | Description |
| --- | --- | --- | --- | --- |
| repo_name | string | | yes | repository name |
| page_num | int | | no | page number |
| page_size | int | | no | page size |
| create_sort | bool | | no | sort by creation time; defaults to descending |
## Repository unbinding
Unbind a repository through this endpoint; all branches under the repository are unbound as well.
- **URL**: `/cerobot/sync/repo/{repo_name}`
- **Method**: `DELETE`
### Request parameters
| Parameter | Type | Example | Required | Description |
| --- | --- | --- | --- | --- |
| repo_name | string | | yes | repository name |
## Branch unbinding
Unbind a branch of a repository through this endpoint.
- **URL**: `/cerobot/sync/{repo_name}/branch/{branch_name}`
- **Method**: `DELETE`
### Request parameters
| Parameter | Type | Example | Required | Description |
| --- | --- | --- | --- | --- |
| repo_name | string | | yes | repository name |
| branch_name | string | | yes | branch name |
Note: when syncing from internal to external, pass the internal branch name; when syncing from external to internal, pass the external branch name.
## Update repository sync status
Update a repository's sync status through this endpoint.
- **URL**: `/cerobot/sync/repo/{repo_name}`
- **Method**: `PUT`
### Request parameters
| Parameter | Type | Example | Required | Description |
| --- | --- | --- | --- | --- |
| repo_name | string | | yes | repository name |
| enable | bool | true/false | yes | whether sync is enabled |
## Update branch sync status
Update the sync status of a branch of a repository through this endpoint.
- **URL**: `/cerobot/sync/{repo_name}/branch/{branch_name}`
- **Method**: `PUT`
### Request parameters
| Parameter | Type | Example | Required | Description |
| --- | --- | --- | --- | --- |
| repo_name | string | | yes | repository name |
| branch_name | string | | yes | branch name |
| enable | bool | true/false | yes | whether sync is enabled |
Note: when syncing from internal to external, pass the internal branch name; when syncing from external to internal, pass the external branch name.
## Get sync logs
Fetch repository/branch sync logs through this endpoint.
- **URL**: `/cerobot/sync/repo/{repo_name}/logs`
- **Method**: `GET`
### Request parameters
| Parameter | Type | Example | Required | Description |
|-----------|--------| --- |------|------|
| repo_name | string | | yes | repository name |
| branch_id | int | | no | branch id |
Note: branch_id is not needed when fetching repository-level sync logs.
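A sketch of both log queries (base URL assumed; the branch id 1 is illustrative):
```python
import requests

BASE_URL = "http://127.0.0.1:8000"  # assumed local deployment

# repository-level logs: omit branch_id
repo_logs = requests.get(f"{BASE_URL}/cerobot/sync/repo/ob-robot-test/logs").json()

# branch-level logs: pass the branch id
branch_logs = requests.get(
    f"{BASE_URL}/cerobot/sync/repo/ob-robot-test/logs",
    params={"branch_id": 1},
).json()
```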

27
Dockerfile Normal file
View File

@ -0,0 +1,27 @@
FROM centos:7
RUN yum update -y && \
yum install -y wget gcc make openssl-devel bzip2-devel libffi-devel zlib-devel
RUN wget -P /data/ob-tool https://www.python.org/ftp/python/3.9.6/Python-3.9.6.tgz
RUN cd /data/ob-tool && tar xzf Python-3.9.6.tgz
RUN cd /data/ob-tool/Python-3.9.6 && ./configure --enable-optimizations && make altinstall
ADD ./ /data/ob-robot/
RUN cd /data/ob-robot/ && \
pip3.9 install -r /data/ob-robot/requirement.txt
RUN yum install -y git openssh-server
ENV GIT_SSH_COMMAND='ssh -o StrictHostKeyChecking=no -i /root/.ssh/id_rsa'
RUN yum install -y autoconf gettext && \
wget http://github.com/git/git/archive/v2.32.0.tar.gz && \
tar -xvf v2.32.0.tar.gz && \
rm -f v2.32.0.tar.gz && \
cd git-* && \
make configure && \
./configure --prefix=/usr && \
make -j16 && \
make install
WORKDIR /data/ob-robot
CMD if [ "$BOOT_MODE" = "app" ] ; then python3.9 main.py; fi

25
Documentation.md Normal file
View File

@ -0,0 +1,25 @@
## Clone the repository and install dependencies
- pip3 install -r requirement.txt
## Set up the database
- Create your own database
- The sql/20240408.sql file in the repository lists the table structures that need to be created in the database
- Set your database connection string under the 'test_env' key of the DB variable in src/base/config.py
## Start the service
- python3 main.py
- Once the service is up, view the API docs at [http://0.0.0.0:8000/docs](http://0.0.0.0:8000/docs)
- Historical log files are written to the local logs directory
## Environment variables
```python
import os

# Whether to delete the sync directory after a sync task finishes
DELETE_SYNC_DIR = os.getenv('DELETE_SYNC_DIR', False)
# Whether to log detailed information when a git command fails
LOG_DETAIL = os.getenv('LOG_DETAIL', True)
# Sync directory
SYNC_DIR = os.getenv("SYNC_DIR", "/tmp/sync_dir/")
```

51
LICENSE Normal file
View File

@ -0,0 +1,51 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files.
"Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions:
You must give any other recipients of the Work or Derivative Works a copy of this License; and
You must cause any modified files to carry prominent notices stating that You changed the files; and
You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and
If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License.
You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS

37
Makefile Normal file
View File

@ -0,0 +1,37 @@
VERSION := $(shell git rev-parse --short HEAD)
SHELL=/bin/bash
CONDA_ACTIVATE=source $$(conda info --base)/etc/profile.d/conda.sh ; conda activate
# Image URL to use all building/pushing image targets
export IMAGE=reg.docker.alibaba-inc.com/ob-robot/reposyncer:v0.0.1
all: backend-docker frontend-docker
##@ docker
extra-download: ## git clone the extra reference
git submodule update --init
frontend-build:
cd web && $(MAKE) static
docker-build: ## Build docker image
docker build -t ${IMAGE} .
docker-push: ## Push docker image
docker push ${IMAGE}
backend-docker: extra-download frontend-build docker-build docker-push
##@ run
py39:
$(CONDA_ACTIVATE) py39
run: py39
python main.py
static:
cd web && $(MAKE) static
backrun:
nohup python main.py > /tmp/robot.log 2>&1 &

76
README-CN.md Normal file
View File

@ -0,0 +1,76 @@
# ob-repository-synchronize
## Description
ob-repository-synchronize is a small tool that helps engineers synchronize code across multiple platforms, including GitHub, Gitee, CodeChina, Gitlink, internal repository platforms, and so on (support for some platforms is still incomplete).
## How it works
### Multi-directional synchronization based on git rebase
<img src="doc/rebase.png" width="500" height="400">
### One-directional synchronization based on git diff
<img src="doc/diff.png" width="500" height="400">
## Backend
### Dependencies
name|version|necessity
--|:--:|--:
python|3.9|True
uvicorn|0.14.0|True
SQLAlchemy|1.4.21|True
fastapi|0.66.0|True
aiohttp|3.7.4|True
pydantic|1.8.2|True
starlette|0.14.2|True
aiomysql|0.0.21|True
requests|2.25.1|True
loguru|0.6.0|True
typing-extensions|4.1.1|True
aiofiles|0.8.0|True
### Installation
> [!NOTE]
> The code must run under Python 3.9.

`pip3 install -r requirement.txt`
`python3 main.py`
### Run the sync script locally
`python3 sync.py`
## Frontend
[See the readme under web](web/README.md)
## Docker
`docker pull XXX:latest`
`docker run -p 8000:8000 -d XXX bash start.sh -s backend`
## Usage
1. Set up the database
   - Create your own database and run the table.sql file under the sql folder
   - Set your database connection string in src/base/config.py
2. Configure your repository addresses, sync branches, and platform tokens (to be completed) through the web page
<img src="doc/website.png" width="500" height="400">
3. Adapt your own sync script (see the two examples under example) and run it under a scheduled job
Things to consider:
   - whether the repository uses an http or ssh URL (and how to provide your ssh key)
   - whether to use the rebase or the diff logic
   - which scheduler to use (perhaps a k8s cronjob, or the Linux crontab)

View File

@ -1,2 +1,75 @@
# reposync
# ob-repository-synchronize
## Description
ob-repository-synchronize is a small tool that helps engineers keep their open source projects' code synchronized between GitHub, Gitee, CodeChina, internal repositories, and so on.
## Principle
### Based on git rebase
<img src="doc/rebase.png" width="500" height="400">
### Based on git diff
<img src="doc/diff.png" width="500" height="400">
## backend
### requirement
name|version|necessity
--|:--:|--:
python|3.9|True
uvicorn|0.14.0|True
SQLAlchemy|1.4.21|True
fastapi|0.66.0|True
aiohttp|3.7.4|True
pydantic|1.8.2|True
starlette|0.14.2|True
aiomysql|0.0.21|True
requests|2.25.1|True
loguru|0.6.0|True
typing-extensions|4.1.1|True
aiofiles|0.8.0|True
### how to install
> [!NOTE]
> Run the code under Python 3.9
`pip3 install -r requirement.txt`
`python3 main.py`
### run the sync script locally
`python3 sync.py`
## frontend
[Refer to the web readme](web/README.md)
## docker
`docker pull XXX:latest`
`docker run -p 8000:8000 -d XXX bash start.sh -s backend`
## How to use it
1. Configure your database
- Run the table.sql script in the sql folder
- Configure the database connection string in src/base/config.py
2. Configure your repo address, branches, and (todo) token via the website
<img src="doc/website.png" width="500" height="400">
3. Write your own sync script (refer to the two examples in the sync folder) and run it under a cronjob
you should consider:
- http address or ssh address (how to add your ssh key)
- rebase logic or diff logic
- which cronjob (maybe a k8s cronjob or the linux system crontab)

41
boot Normal file
View File

@ -0,0 +1,41 @@
#!/bin/bash
# usage:
# docker run -d --net=host -v /path/to/env.ini:/data/ob-robot/env.ini obrobot:1.0.0 ./start.sh -s backend
# docker run -d --net=host -v /path/to/env.ini:/data/ob-robot/env.ini obrobot:1.0.0 ./start.sh -s crontab
# init env
if [[ ! -f env.ini ]]; then
echo "env.ini missing"
exit 1
fi
source env.ini
usage()
{
echo "Usage:"
echo " start.sh -s <service>"
echo "Supported service: backend crontab "
echo "Default service is: backend"
exit 0
}
TEMP=`getopt -o s:h -- "$@"`
eval set -- "$TEMP"
while true ; do
case "$1" in
-h) usage; shift ;;
-s) service=$2; shift 2 ;;
--) shift; break;;
*) echo "Unsupported option"; exit 1;;
esac
done
if [[ x"$service" == x"backend" ]]; then
# start the backend service
python3 main.py
else
echo "Unsupported service"
exit 1
fi

BIN
doc/diff.png Normal file (binary file not shown; 279 KiB)

BIN
doc/rebase.png Normal file (binary file not shown; 282 KiB)

BIN
doc/website.png Normal file (binary file not shown; 67 KiB)

16
env.ini.example Normal file
View File

@ -0,0 +1,16 @@
export SYS_ENV=DEV
export LOG_PATH=
export LOG_LV=DEBUG
# backend database configuration
export CEROBOT_MYSQL_HOST=127.0.0.1
export CEROBOT_MYSQL_PORT=
export CEROBOT_MYSQL_USER=""
export CEROBOT_MYSQL_PWD=""
export CEROBOT_MYSQL_DB=
# cache database configuration
# container image names for running build tasks
export EL8_DOCKER_IMAGE=''
export EL7_DOCKER_IMAGE=''

9
extras/obfastapi/.gitignore vendored Normal file
View File

@ -0,0 +1,9 @@
nohup.out
*.pyc
*.pyo
build
dist
.vscode
.git
__pycache__
.idea/workspace.xml

View File

@ -0,0 +1,4 @@
FROM reg.docker.alibaba-inc.com/obvos/python3:3.9.2
COPY ./requirement.txt /tmp/requirement.txt
RUN /usr/local/bin/pip3.9 install -r /tmp/requirement.txt; rm -f /tmp/requirement.txt

View File

View File

@ -0,0 +1 @@
__VERSION__ = '1.2.3'

168
extras/obfastapi/config.py Normal file
View File

@ -0,0 +1,168 @@
from typing import Union, Dict, Any
from urllib import parse
__all__ = ["ConfigsUtil", "MysqlConfig"]
class ConfigsError(Exception):
pass
class Config:
def __hash__(self):
check_sum = 0
config = self.__dict__
for key in config:
check_sum += key.__hash__()
check_sum += getattr(config[key], '__hash__', lambda:0)()
return check_sum
def __eq__(self, value):
if isinstance(value, self.__class__):
return value.__hash__() == self.__hash__()
return False
class MysqlConfig(Config):
def __init__(self, host: str, port: int, dbname: str, user: str, passwd: str=None):
self.host = host
self.port = port
self.dbname = dbname
self.user = user
self.passwd = passwd
def get_url(self, drive: str="aiomysql", charset='utf8'):
user = parse.quote_plus(self.user)
if self.passwd:
user = '%s:%s' % (user, parse.quote_plus(self.passwd))
url = "mysql+%s://%s@%s:%s/%s" % (drive, user, self.host, self.port, self.dbname)
if charset:
url += "?charset=%s" % charset
return url
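# Usage sketch (values are illustrative): MysqlConfig builds a SQLAlchemy URL, e.g.
#   cfg = MysqlConfig(host='127.0.0.1', port=3306, dbname='robot', user='ob', passwd='p@ss')
#   cfg.get_url()  # -> 'mysql+aiomysql://ob:p%40ss@127.0.0.1:3306/robot?charset=utf8'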
class RedisConfig(Config):
def __init__(
self,
host: str,
*,
port: int = 6379,
db: Union[str, int] = 0,
password: str = None,
socket_timeout: float = None,
socket_connect_timeout: float = None,
socket_keepalive: bool = None,
socket_keepalive_options: Dict[str, Any] = None,
unix_socket_path: str = None,
encoding: str = "utf-8",
encoding_errors: str = "strict",
decode_responses: bool = False,
retry_on_timeout: bool = False,
ssl: bool = False,
ssl_keyfile: str = None,
ssl_certfile: str = None,
ssl_cert_reqs: str = "required",
ssl_ca_certs: str = None,
ssl_check_hostname: bool = False,
max_connections: int = 0,
single_connection_client: bool = False,
health_check_interval: int = 0,
client_name: str = None,
username: str = None
):
self.host: str = host
self.port: int = port
self.db: Union[str, int] = db
self.password: str = password
self.socket_timeout: float = socket_timeout
self.socket_connect_timeout: float = socket_connect_timeout
self.socket_keepalive: bool = socket_keepalive
self.socket_keepalive_options: Dict[str, Any] = socket_keepalive_options
self.unix_socket_path: str = unix_socket_path
self.encoding: str = encoding
self.encoding_errors: str = encoding_errors
self.decode_responses: bool = decode_responses
self.retry_on_timeout: bool = retry_on_timeout
self.ssl: bool = ssl
self.ssl_keyfile: str = ssl_keyfile
self.ssl_certfile: str = ssl_certfile
self.ssl_cert_reqs: str = ssl_cert_reqs
self.ssl_ca_certs: str = ssl_ca_certs
self.ssl_check_hostname: bool = ssl_check_hostname
self.max_connections: int = max_connections
self.single_connection_client: bool = single_connection_client
self.health_check_interval: int = health_check_interval
self.client_name: str = client_name
self.username: str = username
@property
def config(self) -> Dict:
return self.__dict__
def __hash__(self):
check_sum = 0
config = self.config
for key in config:
check_sum += key.__hash__()
check_sum += getattr(config[key], '__hash__', lambda:0)()
return check_sum
class ObFastApi(Config):
def __init__(self, buc_key: str = "OBVOS_USER_SIGN", log_name: str = 'obfastapi', log_path: str = None, log_level: str = "INFO", log_interval: int = 1, log_count: int = 7):
self.buc_key = buc_key
self.log_name = log_name
self.log_path = log_path
self.log_level = log_level
self.log_interval = log_interval
self.log_count = log_count
class ConfigsUtil:
MYSQL: Dict[str, MysqlConfig] = {}
REDIS: Dict[str, RedisConfig] = {}
OB_FAST_API = ObFastApi()
@staticmethod
def get_config(configs: Dict[str, Config], key:str) -> Config:
config = configs.get(key)
if not config:
            raise ConfigsError('No such config %s' % key)
return config
@staticmethod
def set_config(configs: Dict[str, Config], key:str, config: Config):
configs[key] = config
@classmethod
def get_mysql_config(cls, key: str) -> MysqlConfig:
return cls.get_config(cls.MYSQL, key)
@classmethod
def set_mysql_config(cls, key: str, config: MysqlConfig):
cls.set_config(cls.MYSQL, key, config)
@classmethod
def get_redis_config(cls, key: str) -> RedisConfig:
return cls.get_config(cls.REDIS, key)
@classmethod
def set_redis_config(cls, key: str, config: RedisConfig):
cls.set_config(cls.REDIS, key, config)
@classmethod
def get_obfastapi_config(cls, key: str):
return getattr(cls.OB_FAST_API, key.lower(), '')
@classmethod
def set_obfastapi_config(cls, key: str, value: Any):
setattr(cls.OB_FAST_API, key.lower(), value)
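# Usage sketch (the key name 'test_env' is illustrative): register a config once at
# startup, then look it up anywhere by key:
#   ConfigsUtil.set_mysql_config('test_env', MysqlConfig('127.0.0.1', 3306, 'robot', 'ob'))
#   ConfigsUtil.get_mysql_config('test_env').get_url()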

1113
extras/obfastapi/frame.py Normal file

File diff suppressed because it is too large

221
extras/obfastapi/log.py Normal file
View File

@ -0,0 +1,221 @@
import re
import os
import sys
import logging
from logging import handlers
class LogRecord(logging.LogRecord):
def __init__(self, name, level, pathname, lineno, msg, args, exc_info, func, sinfo):
super().__init__(name, level, pathname, lineno, msg, args, exc_info, func, sinfo)
try:
self.package = os.path.split(os.path.dirname(pathname))[1]
except (TypeError, ValueError, AttributeError):
self.package = "Unknown package"
class StreamHandler(logging.StreamHandler):
def emit(self, record: logging.LogRecord):
try:
msg = self.format(record)
stream = self.stream
stream.write(msg + self.terminator)
if stream != sys.stderr:
                print(msg)
self.flush()
except RecursionError:
raise
except Exception:
self.handleError(record)
class TimedRotatingFileHandler(handlers.TimedRotatingFileHandler):
def emit(self, record: logging.LogRecord):
try:
if self.shouldRollover(record):
self.doRollover()
if self.stream is None:
self.stream = self._open()
StreamHandler.emit(self, record)
except Exception:
self.handleError(record)
class Formatter(logging.Formatter):
def __init__(
self,
show_asctime: bool = True,
show_level: bool = True,
show_logger_name: bool = True,
show_path: bool = False,
show_file_name: bool = True,
show_line_no: bool = True,
show_func_name: bool = True,
datefmt: str = "%Y-%m-%d %H:%M:%S.%03f"
):
        match = re.match(r'.*([^a-zA-Z]*%(\d*)f)$', datefmt)
if match:
groups = match.groups()
datefmt = datefmt[:-len(groups[0])]
time_str = '[%(asctime)s%(msecs)' + groups[1] + 'd] '
else:
time_str = '[%(asctime)s] '
fmt = '%(message)s'
if show_path:
trace_info = '%(pathname)s'
elif show_file_name:
trace_info = '%(package)s/%(filename)s'
else:
trace_info = ''
if trace_info:
if show_line_no:
trace_info += ':%(lineno)d'
fmt = '(%s) %s' % (trace_info, fmt)
if show_func_name:
fmt = '%(funcName)s ' + fmt
if show_logger_name:
fmt = '[%(name)s] ' + fmt
if show_level:
fmt = '%(levelname)s ' + fmt
if show_asctime:
fmt = time_str + fmt
super().__init__(fmt, datefmt, style='%', validate=True)
DEFAULT_HANDLER = StreamHandler(None)
DEFAULT_FORMATTER = Formatter()
DEFAULT_LEVEL = 'WARN'
DEFAULT_PATH = None
DEFAULT_INTERVAL = 1
DEFAULT_BACKUP_COUNT = 7
class OBLogger(logging.Logger):
def __init__(self, name: str, level: str = DEFAULT_LEVEL, path: str = DEFAULT_PATH, interval: int = DEFAULT_INTERVAL, backup_count: int = DEFAULT_BACKUP_COUNT, formatter: Formatter = DEFAULT_FORMATTER):
super().__init__(name, level)
self.handlers = []
self._interval = interval
self._backup_count = backup_count
self._formatter = formatter
self._path = self._format_path(path) if path else None
self._default_handler = None
self._create_file_handler()
@property
def interval(self):
return self._interval
@property
def backup_count(self):
return self._backup_count
@property
def formatter(self):
return self._formatter
@property
def path(self):
return self._path
@interval.setter
def interval(self, interval: int):
if interval != self._interval:
self._interval = interval
self._create_file_handler()
@backup_count.setter
def backup_count(self, backup_count: int):
if backup_count != self._backup_count:
self._backup_count = backup_count
self._create_file_handler()
@formatter.setter
def formatter(self, formatter: Formatter):
if formatter != self._formatter:
self._formatter = formatter
self._create_file_handler()
@path.setter
def path(self, path):
path = self._format_path(path) if path else None
if path and path != self._path:
self._path = path
self._create_file_handler()
def _create_file_handler(self):
if self._default_handler:
self.removeHandler(self._default_handler)
if self.path:
self._default_handler = TimedRotatingFileHandler(self.path, when='midnight', interval=self.interval, backupCount=self.backup_count)
else:
self._default_handler = DEFAULT_HANDLER
self._default_handler.setFormatter(self.formatter)
self.addHandler(self._default_handler)
def _format_path(self, path: str):
return path % self.__dict__
class LoggerFactory(object):
LOGGERS = logging.Logger.manager.loggerDict
GLOBAL_CONFIG = {}
@classmethod
def init(cls):
if logging.getLoggerClass() != OBLogger:
logging.setLoggerClass(OBLogger)
logging.setLogRecordFactory(LogRecord)
# logging.basicConfig()
cls.update_global_config()
@classmethod
def update_global_config(cls, level: str = DEFAULT_LEVEL, path: str = DEFAULT_PATH, interval: int = DEFAULT_INTERVAL, backup_count: int = DEFAULT_BACKUP_COUNT, formatter: Formatter = DEFAULT_FORMATTER):
        args = locals()
        args.pop('cls', None)  # locals() also captures cls, which is not a config entry
        updates = {}
for key in args:
value = args[key]
if value != cls.GLOBAL_CONFIG.get(key):
cls.GLOBAL_CONFIG[key] = updates[key] = value
        update_path = updates.get('path')
if updates:
for name in cls.LOGGERS:
logger = cls.LOGGERS[name]
if not isinstance(logger, logging.Logger):
continue
for key in updates:
if key == 'level':
logger.setLevel(updates[key])
else:
setattr(logger, key, updates[key])
if update_path and not isinstance(logger, OBLogger):
                    logger.handlers.append(TimedRotatingFileHandler(path, when='midnight', interval=interval, backupCount=backup_count))
@classmethod
def create_logger(cls, name: str, level: str = DEFAULT_LEVEL, path: str = DEFAULT_PATH, interval: int = DEFAULT_INTERVAL, backup_count: int = DEFAULT_BACKUP_COUNT, formatter: Formatter = DEFAULT_FORMATTER):
if name in cls.LOGGERS:
raise Exception('Logger `%s` has been created' % name)
        logging._acquireLock()
        logger = logging.getLogger(name)
        cls.LOGGERS[name] = logger
        logging._releaseLock()
        # apply the requested configuration; setLoggerClass makes getLogger create an OBLogger
        logger.setLevel(level)
        if isinstance(logger, OBLogger):
            logger.interval = interval
            logger.backup_count = backup_count
            logger.formatter = formatter
            logger.path = path
        return logger
@classmethod
def get_logger(cls, name: str):
logger = cls.LOGGERS.get(name)
if logger is None:
logger = cls.create_logger(name)
return logger
LoggerFactory.init()
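# Usage sketch (name and path are illustrative): create a logger that rotates at
# midnight and keeps 7 backups, or fetch an existing one by name:
#   logger = LoggerFactory.create_logger('sync', level='INFO', path='/tmp/sync.log')
#   logger.info('hello')
#   logger = LoggerFactory.get_logger('sync')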

128
extras/obfastapi/mysql.py Normal file
View File

@ -0,0 +1,128 @@
import sys
from typing_extensions import Self
from .log import LoggerFactory
from .config import ConfigsUtil, MysqlConfig
Logger = LoggerFactory.create_logger(
name='sqlalchemy.engine',
level=ConfigsUtil.get_obfastapi_config('log_level'),
path=ConfigsUtil.get_obfastapi_config('log_path'),
interval=ConfigsUtil.get_obfastapi_config('log_interval'),
backup_count=ConfigsUtil.get_obfastapi_config('log_count')
)
from sqlalchemy.dialects.mysql.base import MySQLDialect
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy.ext.asyncio import create_async_engine
from sqlalchemy.orm import sessionmaker
from sqlalchemy.orm.session import Session
from sqlalchemy.exc import DatabaseError
__all__ = ('aiomysql_session', 'AIOMysqlSessionMakerFactory', 'OBDataBaseError')
OBDataBaseError = DatabaseError
def _get_server_version_info(self, connection):
# get database server version info explicitly over the wire
# to avoid proxy servers like MaxScale getting in the
# way with their own values, see #4205
dbapi_con = connection.connection
cursor = dbapi_con.cursor()
cursor.execute("show global variables like 'version_comment'")
val = cursor.fetchone()
if val and 'OceanBase' in val[1]:
val = '5.6.0'
else:
cursor.execute("SELECT VERSION()")
val = cursor.fetchone()[0]
cursor.close()
from sqlalchemy import util
if util.py3k and isinstance(val, bytes):
val = val.decode()
return self._parse_server_version(val)
setattr(MySQLDialect, '_get_server_version_info', _get_server_version_info)
class ConfigKey:
def __init__(self, **config):
check_sum = 0
for key in config:
check_sum += key.__hash__()
check_sum += getattr(config[key], '__hash__', lambda: 0)()
self.__hash = check_sum
def __hash__(self):
return self.__hash
def __eq__(self, value):
if isinstance(value, self.__class__):
return value.__hash__() == self.__hash__()
return False
class ORMAsyncExplicitTransactionHolder():
def __init__(self, session: AsyncSession):
self.session = session
async def __aenter__(self):
await self.session.execute('BEGIN')
async def __aexit__(self, exc_type, exc_val, exc_tb):
if exc_val is None:
await self.session.commit()
else:
await self.session.rollback()
raise exc_val
class ORMAsyncSession(AsyncSession, Session):
async def __aenter__(self) -> Self:
await super().__aenter__()
return self
def begin(self) -> ORMAsyncExplicitTransactionHolder:
return ORMAsyncExplicitTransactionHolder(self)
class AIOMysqlSessionMakerFactory:
_SESSIONS_MAKER = {}
@classmethod
def get_instance(cls, key: str, **kwargs) -> ORMAsyncSession:
config = ConfigsUtil.get_mysql_config(key)
config_key = ConfigKey(__config__=config, **kwargs)
if config_key not in cls._SESSIONS_MAKER:
cls._SESSIONS_MAKER[config_key] = cls.create_instance(config, **kwargs)
return cls._SESSIONS_MAKER[config_key]
@classmethod
def create_instance(cls, config: MysqlConfig, **kwargs) -> ORMAsyncSession:
engine = create_async_engine(config.get_url(), **kwargs)
return sessionmaker(engine, autocommit=False, expire_on_commit=False, class_=ORMAsyncSession)
def aiomysql_session(
key: str,
max_overflow: int = 20,
pool_size: int = 10,
pool_timeout: int = 5,
pool_recycle: int = 28800,
echo: bool = False,
**kwargs
) -> ORMAsyncSession:
return AIOMysqlSessionMakerFactory.get_instance(
key,
max_overflow=max_overflow,
pool_size=pool_size,
pool_timeout=pool_timeout,
pool_recycle=pool_recycle,
echo=echo,
**kwargs
)()
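# Usage sketch (the key name is illustrative): a session is created from the
# MysqlConfig registered under the same key via ConfigsUtil.set_mysql_config:
#   async def demo():
#       async with aiomysql_session('test_env') as session:
#           async with session.begin():
#               await session.execute('SELECT 1')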

27
extras/obfastapi/redis.py Normal file
View File

@ -0,0 +1,27 @@
from copy import deepcopy
from typing import Dict
from aioredis.client import Redis
from .config import ConfigsUtil, RedisConfig
__all__ = ('RedisConnectionPoolFactory',)
class RedisConnectionPoolFactory:
_POOLS: Dict[RedisConfig, Redis] = {}
@classmethod
def get_instance(cls, key: str) -> Redis:
config = ConfigsUtil.get_redis_config(key)
config = deepcopy(config)
if config not in cls._POOLS:
cls._POOLS[config] = cls.create_instance(config)
return cls._POOLS[config]
@classmethod
def create_instance(cls, config: RedisConfig) -> Redis:
return Redis(**config.config)
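# Usage sketch (the key name is illustrative): register a RedisConfig first, then
# fetch a shared client keyed by that config:
#   ConfigsUtil.set_redis_config('cache', RedisConfig('127.0.0.1', port=6379))
#   redis = RedisConnectionPoolFactory.get_instance('cache')
#   await redis.set('k', 'v')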

View File

@ -0,0 +1,10 @@
uvicorn==0.14.0
SQLAlchemy==1.4.21
fastapi==0.65.2
aiohttp==3.7.4.post0
pydantic==1.8.2
starlette==0.14.2
aiomysql==0.0.21
aioredis==2.0.0
requests==2.25.1
typing_extensions==4.1.1

259
extras/obfastapi/rpc.py Normal file
View File

@ -0,0 +1,259 @@
import inspect
import functools
from typing import Optional, Mapping, Any, Union
from enum import Enum
import aiohttp
from aiohttp.typedefs import LooseHeaders, StrOrURL, JSONDecoder, DEFAULT_JSON_DECODER
from .log import LoggerFactory
from .config import ConfigsUtil
Logger = LoggerFactory.create_logger(
name = '%s.rpc' % ConfigsUtil.get_obfastapi_config('log_name'),
level = ConfigsUtil.get_obfastapi_config('log_level'),
path = ConfigsUtil.get_obfastapi_config('log_path'),
interval = ConfigsUtil.get_obfastapi_config('log_interval'),
backup_count = ConfigsUtil.get_obfastapi_config('log_count')
)
__all__ = ("RPCResponse", "RPCService", "RPCServiceCenter")
def iscoroutinefunction_or_partial(obj: Any) -> bool:
"""
Correctly determines if an object is a coroutine function,
including those wrapped in functools.partial objects.
"""
while isinstance(obj, functools.partial):
obj = obj.func
return inspect.iscoroutinefunction(obj)
DEFAULT_TIMEOUT = aiohttp.ClientTimeout(3 * 60)
class RPCResponse:
def __init__(self, status_code: int, text: str):
self.status_code = status_code
self.text = text
def json(self, loads: JSONDecoder = DEFAULT_JSON_DECODER) -> Any:
stripped = self.text.strip() # type: ignore
if not stripped:
return None
return loads(stripped)
class RPCService:
def __init__(self, host: str, headers: LooseHeaders={}):
self._host = host
self._headers = headers
@property
def host(self):
return self._host
@property
def headers(self):
return self._headers
async def get(
self,
url: StrOrURL,
*,
params: Optional[Mapping[str, str]] = None,
data: Any = None,
json: Any = None,
headers: Optional[LooseHeaders] = None,
encoding: Optional[str] = None,
timeout: Union[aiohttp.ClientTimeout, int] = DEFAULT_TIMEOUT,
**kwargs
) -> RPCResponse:
return await self.request("GET", url, params=params, data=data, json=json, headers=headers, encoding=encoding, timeout=timeout, **kwargs)
async def options(
self,
url: StrOrURL,
*,
params: Optional[Mapping[str, str]] = None,
data: Any = None,
json: Any = None,
headers: Optional[LooseHeaders] = None,
encoding: Optional[str] = None,
timeout: Union[aiohttp.ClientTimeout, int] = DEFAULT_TIMEOUT,
**kwargs
) -> RPCResponse:
return await self.request("OPTIONS", url, params=params, data=data, json=json, headers=headers, encoding=encoding, timeout=timeout, **kwargs)
async def head(
self,
url: StrOrURL,
*,
params: Optional[Mapping[str, str]] = None,
data: Any = None,
json: Any = None,
headers: Optional[LooseHeaders] = None,
encoding: Optional[str] = None,
timeout: Union[aiohttp.ClientTimeout, int] = DEFAULT_TIMEOUT,
**kwargs
) -> RPCResponse:
return await self.request("HEAD", url, params=params, data=data, json=json, headers=headers, encoding=encoding, timeout=timeout, **kwargs)
async def post(
self,
url: StrOrURL,
*,
params: Optional[Mapping[str, str]] = None,
data: Any = None,
json: Any = None,
headers: Optional[LooseHeaders] = None,
encoding: Optional[str] = None,
timeout: Union[aiohttp.ClientTimeout, int] = DEFAULT_TIMEOUT,
**kwargs
) -> RPCResponse:
return await self.request("POST", url, params=params, data=data, json=json, headers=headers, encoding=encoding, timeout=timeout, **kwargs)
async def put(
self,
url: StrOrURL,
*,
params: Optional[Mapping[str, str]] = None,
data: Any = None,
json: Any = None,
headers: Optional[LooseHeaders] = None,
encoding: Optional[str] = None,
timeout: Union[aiohttp.ClientTimeout, int] = DEFAULT_TIMEOUT,
**kwargs
) -> RPCResponse:
return await self.request("PUT", url, params=params, data=data, json=json, headers=headers, encoding=encoding, timeout=timeout, **kwargs)
async def patch(
self,
url: StrOrURL,
*,
params: Optional[Mapping[str, str]] = None,
data: Any = None,
json: Any = None,
headers: Optional[LooseHeaders] = None,
encoding: Optional[str] = None,
timeout: Union[aiohttp.ClientTimeout, int] = DEFAULT_TIMEOUT,
**kwargs
) -> RPCResponse:
return await self.request("PATCH", url, params=params, data=data, json=json, headers=headers, encoding=encoding, timeout=timeout, **kwargs)
async def delete(
self,
url: StrOrURL,
*,
params: Optional[Mapping[str, str]] = None,
data: Any = None,
json: Any = None,
headers: Optional[LooseHeaders] = None,
encoding: Optional[str] = None,
timeout: Union[aiohttp.ClientTimeout, int] = DEFAULT_TIMEOUT,
**kwargs
) -> RPCResponse:
return await self.request("DELETE", url, params=params, data=data, json=json, headers=headers, encoding=encoding, timeout=timeout, **kwargs)
async def request(
self,
method: str,
url: StrOrURL,
*,
params: Optional[Mapping[str, str]] = None,
data: Any = None,
json: Any = None,
headers: Optional[LooseHeaders] = None,
encoding: Optional[str] = None,
timeout: Union[aiohttp.ClientTimeout, int] = DEFAULT_TIMEOUT,
**kwargs
) -> RPCResponse:
if not url.startswith(self._host):
url = "%s/%s" % (self._host, url)
if headers:
if self._headers:
headers.update(self._headers)
else:
headers = self._headers
if isinstance(timeout, int):
timeout = aiohttp.ClientTimeout(total=timeout)
Logger.debug('request %s params %s data %s json %s headers %s timeout %s' % (url, params, data, json, headers, timeout))
async with aiohttp.request(method.upper(), url, params=params, data=data, json=json, headers=headers, timeout=timeout, **kwargs) as resp:
text = await resp.text(encoding=encoding)
Logger.debug('response code %s, text %s' % (resp.status, text))
return RPCResponse(
status_code=resp.status,
text=text
)
class RPCServiceError(Exception):
pass
class RPCServiceCenter:
_SERVICES = {}
@classmethod
def register(cls, name: str, *arg, **kwargs):
"""
example:
# register AService
@RPCServiceCenter.register("service_a", host="http://127.1")
class AService(RPCService):
def get_host(self):
return self.host
"""
def decorator(clz):
if name not in cls._SERVICES:
cls._SERVICES[name] = clz(*arg, **kwargs)
else:
raise RPCServiceError("'%s' is already registered by %s" % (name, cls._SERVICES[name].__class__))
return clz
return decorator
@classmethod
def call(cls, service_name: str, param_name: Optional[str]=None):
"""
example:
# example one: call AService
@RPCServiceCenter.call("service_a")
def call_aservice(service_a: AService):
print (service_a.get_host())
# example two: call AService
@RPCServiceCenter.call("service_a", "a_service")
def call_service_a(a_service: AService):
print (a_service.get_host())
params:
service_name: service name registered in RPCServiceCenter
param_name: name of service object in function
"""
def decorator(func):
def component(*arg, **kwargs):
kwargs[param_name] = cls._SERVICES[service_name]
return func(*arg, **kwargs)
async def async_component(*arg, **kwargs):
kwargs[param_name] = cls._SERVICES[service_name]
return await func(*arg, **kwargs)
if service_name not in cls._SERVICES:
raise RPCServiceError("No such service '%s'" % service_name)
return async_component if iscoroutinefunction_or_partial(func) else component
if param_name is None:
param_name = service_name
return decorator

33
main.py Normal file
View File

@ -0,0 +1,33 @@
# coding: utf-8
import uvicorn
import src.api.Cerobot
import src.api.Sync
import src.api.Account
import src.api.PullRequest
import src.api.User
import src.api.Log
import src.api.Auth
import src.api.Sync_config
from extras.obfastapi.frame import OBFastAPI
from src.router import CE_ROBOT, PROJECT, JOB, ACCOUNT, PULL_REQUEST, USER, LOG, AUTH, SYNC_CONFIG
from fastapi.staticfiles import StaticFiles
app = OBFastAPI()
app.include_router(CE_ROBOT)
app.include_router(PROJECT)
app.include_router(JOB)
app.include_router(ACCOUNT)
app.include_router(PULL_REQUEST)
app.include_router(USER)
app.include_router(LOG)
app.include_router(AUTH)
app.include_router(SYNC_CONFIG)
app.mount("/", StaticFiles(directory="web/dist"), name="static")
if __name__ == '__main__':
    # The workers argument only takes effect when uvicorn is launched from the command line, or via the WEB_CONCURRENCY environment variable
uvicorn.run(app='main:app', host='0.0.0.0', port=8000,
reload=True, debug=True, workers=2)

11
requirement.txt Normal file
View File

@ -0,0 +1,11 @@
uvicorn==0.14.0
SQLAlchemy==1.4.21
fastapi==0.66.0
aiohttp==3.7.4.post0
pydantic==1.8.2
starlette==0.14.2
aiomysql==0.0.21
requests==2.26.0
loguru==0.6.0
typing-extensions==4.1.1
aiofiles==0.8.0

28
script/backend-conf.yaml Normal file
View File

@ -0,0 +1,28 @@
apiVersion: v1
kind: ConfigMap
metadata:
name: reposyncer-conf
namespace: reposyncer
labels:
name: reposyncer-conf
data:
env.ini: |
export SYS_ENV=DEV
export LOG_PATH=
export LOG_LV=DEBUG
    # backend database configuration
export CEROBOT_MYSQL_HOST=
export CEROBOT_MYSQL_PORT=
export CEROBOT_MYSQL_USER=""
export CEROBOT_MYSQL_PWD=""
export CEROBOT_MYSQL_DB=""
    # symmetric encryption key
    export DATA_ENCRYPT_KEY=
    # container image names for running build tasks
export EL8_DOCKER_IMAGE=''
export EL7_DOCKER_IMAGE=''
# authentication
export BUC_KEY=OBRDE_DEV_USER_SIGN

View File

@ -0,0 +1,13 @@
apiVersion: v1
kind: Service
metadata:
namespace: reposyncer-test
name: reposyncer-test-backend
labels:
k8s-app: reposyncer-test-backend
spec:
ports:
- port: 80
targetPort: 8000
selector:
k8s-app: reposyncer-test-backend

40
script/backend.yaml Normal file
View File

@ -0,0 +1,40 @@
apiVersion: apps/v1
kind: Deployment
metadata:
name: reposyncer-test-backend
namespace: reposyncer-test
spec:
selector:
matchLabels:
k8s-app: reposyncer-test-backend
replicas: 1
template:
metadata:
labels:
k8s-app: reposyncer-test-backend
spec:
containers:
- name: reposyncer
image: #/ob-robot/reposyncer:v0.0.1
imagePullPolicy: Always
ports:
- containerPort: 8000
env:
- name: BOOT_MODE
value: "app"
- name: WEB_CONCURRENCY
value: "4"
- name: SYS_ENV
value: "DEV"
- name: CEROBOT_MYSQL_HOST
value: ""
- name: CEROBOT_MYSQL_PORT
value: ""
- name: CEROBOT_MYSQL_USER
value: ""
- name: CEROBOT_MYSQL_PWD
value: ""
- name: CEROBOT_MYSQL_DB
value: ""
- name: BUC_KEY
value: "OBRDE_DEV_USER_SIGN"

View File

@ -0,0 +1,13 @@
kind: Service
apiVersion: v1
metadata:
name: ob-robot-frontend
namespace: ob-robot
labels:
k8s-app: ob-robot-frontend
spec:
selector:
k8s-app: ob-robot-frontend
ports:
- port: 80
targetPort: 8080

21
script/frontend.yaml Normal file
View File

@ -0,0 +1,21 @@
apiVersion: apps/v1
kind: Deployment
metadata:
name: ob-robot-frontend
namespace: ob-robot
spec:
replicas: 1
selector:
matchLabels:
k8s-app: ob-robot-frontend
template:
metadata:
labels:
k8s-app: ob-robot-frontend
spec:
containers:
- name: ob-robot-frontend
image: #/ob-robot/frontend:v0.0.14
imagePullPolicy: Always
ports:
- containerPort: 8080

4
script/namespace.yaml Normal file
View File

@ -0,0 +1,4 @@
apiVersion: v1
kind: Namespace
metadata:
name: reposyncer-test

49
script/sync-cronjob.yaml Normal file
View File

@ -0,0 +1,49 @@
apiVersion: batch/v1beta1
kind: CronJob
metadata:
name: document-sync
namespace: ob-robot
spec:
concurrencyPolicy: Forbid
schedule: "*/10 * * * *"
jobTemplate:
spec:
template:
spec:
volumes:
- name: ssh-key-volume
secret:
secretName: my-ssh-key
defaultMode: 256
containers:
- name: document-sync
image: #/ob-robot/reposyncer:v0.0.1
imagePullPolicy: IfNotPresent
command:
- python3
- sync.py
env:
- name: BOOT_MODE
value: "sync"
- name: WEB_CONCURRENCY
value: "4"
- name: SYS_ENV
value: "DEV"
- name: CEROBOT_MYSQL_HOST
value: ""
- name: CEROBOT_MYSQL_PORT
value: ""
- name: CEROBOT_MYSQL_USER
value: ""
- name: CEROBOT_MYSQL_PWD
value: ""
- name: CEROBOT_MYSQL_DB
value: ""
- name: BUC_KEY
value: "OBRDE_DEV_USER_SIGN"
volumeMounts:
- name: ssh-key-volume
mountPath: "/root/.ssh/"
restartPolicy: OnFailure

41
sql/20240408.sql Normal file
View File

@ -0,0 +1,41 @@
--
CREATE TABLE IF NOT EXISTS `sync_repo_mapping` (
`id` bigint unsigned NOT NULL AUTO_INCREMENT,
`repo_name` varchar(255) NOT NULL COMMENT '仓库名称',
`enable` tinyint(1) NOT NULL COMMENT '同步状态',
`internal_repo_address` varchar(255) NOT NULL COMMENT '内部仓库地址',
`external_repo_address` varchar(255) NOT NULL COMMENT '外部仓库地址',
`sync_granularity` enum('all', 'one') NOT NULL COMMENT '同步粒度',
`sync_direction` enum('to_outer', 'to_inter') NOT NULL COMMENT '首次同步方向',
`created_at` TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP COMMENT '仓库绑定时间',
PRIMARY KEY (`id`),
UNIQUE KEY (`repo_name`)
) DEFAULT CHARACTER SET = utf8mb4 COMMENT = '同步仓库映射表';
--
CREATE TABLE IF NOT EXISTS `sync_branch_mapping`(
`id` bigint unsigned NOT NULL AUTO_INCREMENT,
`repo_id` bigint unsigned NOT NULL COMMENT '仓库ID',
`enable` tinyint(1) NOT NULL COMMENT '同步状态',
`internal_branch_name` varchar(255) NOT NULL COMMENT '内部仓库分支名称',
`external_branch_name` varchar(255) NOT NULL COMMENT '外部仓库分支名称',
`sync_direction` enum('to_outer', 'to_inter') NOT NULL COMMENT '首次同步方向',
`created_at` TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP COMMENT '分支绑定时间',
PRIMARY KEY (`id`)
) DEFAULT CHARACTER SET = utf8mb4 COMMENT = '同步分支映射表';
--
CREATE TABLE IF NOT EXISTS `repo_sync_log`(
`id` bigint unsigned NOT NULL AUTO_INCREMENT,
`branch_id` bigint unsigned COMMENT '分支id',
`repo_name` varchar(255) NOT NULL COMMENT '仓库名称',
`commit_id` varchar(255) COMMENT 'commit ID',
`log` longtext COMMENT '同步日志',
`sync_direct` enum('to_outer', 'to_inter') NOT NULL COMMENT '同步方向',
`created_at` TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP COMMENT '创建时间',
`update_at` TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP COMMENT '更新时间',
PRIMARY KEY (`id`)
) DEFAULT CHARACTER SET = utf8mb4 COMMENT = '同步日志';

71
sql/alter.sql Normal file
View File

@ -0,0 +1,71 @@
ALTER TABLE `github_account`
MODIFY COLUMN `id` bigint unsigned AUTO_INCREMENT,
MODIFY COLUMN `domain` varchar(20) COMMENT '域账号',
MODIFY COLUMN `nickname` varchar(20) COMMENT '花名',
MODIFY COLUMN `account` varchar(50) COMMENT 'GitHub账号',
MODIFY COLUMN `email` varchar(50) COMMENT '邮箱',
MODIFY COLUMN `create_time` DATETIME COMMENT '创建时间',
MODIFY COLUMN `update_time` DATETIME COMMENT '更新时间';
ALTER TABLE `gitee_account`
MODIFY COLUMN `id` bigint unsigned AUTO_INCREMENT,
MODIFY COLUMN `domain` varchar(20) COMMENT '域账号',
MODIFY COLUMN `nickname` varchar(20) COMMENT '花名',
    MODIFY COLUMN `account` varchar(20) COMMENT 'Gitee账号',
MODIFY COLUMN `email` varchar(20) COMMENT '邮箱',
MODIFY COLUMN `create_time` DATETIME COMMENT '创建时间',
MODIFY COLUMN `update_time` DATETIME COMMENT '更新时间';
ALTER TABLE `sync_project`
MODIFY COLUMN `id` bigint unsigned AUTO_INCREMENT,
MODIFY COLUMN `name` varchar(50) COMMENT '名称',
MODIFY COLUMN `github` varchar(100) COMMENT 'GitHub地址',
MODIFY COLUMN `gitee` varchar(100) COMMENT 'Gitee地址',
MODIFY COLUMN `gitlab` varchar(100) COMMENT 'Gitlab地址',
MODIFY COLUMN `code_china` varchar(100) COMMENT 'CodeChina地址',
MODIFY COLUMN `gitlink` varchar(100) COMMENT 'Gitlink地址',
MODIFY COLUMN `github_token` varchar(100) COMMENT 'GitHub token',
MODIFY COLUMN `gitee_token` varchar(100) COMMENT 'Gitee token',
MODIFY COLUMN `code_china_token` varchar(100) COMMENT 'CodeChina token',
MODIFY COLUMN `gitlink_token` varchar(100) COMMENT 'Gitlink token',
MODIFY COLUMN `create_time` DATETIME COMMENT '创建时间',
MODIFY COLUMN `update_time` DATETIME COMMENT '更新时间';
ALTER TABLE `sync_job`
MODIFY COLUMN `id` bigint unsigned AUTO_INCREMENT,
MODIFY COLUMN `project` varchar(50) COMMENT '工程名称',
MODIFY COLUMN `type` enum('OneWay','TwoWay') COMMENT '同步类型',
MODIFY COLUMN `status` tinyint(1) COMMENT '同步流状态',
MODIFY COLUMN `github_branch` varchar(50) COMMENT 'GitHub分支',
MODIFY COLUMN `gitee_branch` varchar(50) COMMENT 'Gitee分支',
MODIFY COLUMN `gitlab_branch` varchar(50) COMMENT 'Gitlab分支',
MODIFY COLUMN `code_china_branch` varchar(50) COMMENT 'CodeChina分支',
MODIFY COLUMN `gitlink_branch` varchar(50) COMMENT 'Gitlink分支',
MODIFY COLUMN `create_time` DATETIME COMMENT '创建时间',
MODIFY COLUMN `update_time` DATETIME COMMENT '更新时间',
MODIFY COLUMN `commit` varchar(50) COMMENT '最新commit';
ALTER TABLE `pull_request`
MODIFY COLUMN `id` bigint unsigned AUTO_INCREMENT,
MODIFY COLUMN `pull_request_id` bigint unsigned COMMENT 'pull request id',
MODIFY COLUMN `title` text COMMENT 'title',
MODIFY COLUMN `project` varchar(20) COMMENT '工程名称',
MODIFY COLUMN `type` enum('GitHub','Gitee','Gitlab','Gitcode','Gitlink') COMMENT '仓库类型',
MODIFY COLUMN `address` varchar(100) COMMENT 'pull request详情页地址',
MODIFY COLUMN `author` varchar(20) COMMENT '作者',
MODIFY COLUMN `email` varchar(50) COMMENT '邮箱',
MODIFY COLUMN `target_branch` varchar(50) COMMENT '目标分支',
MODIFY COLUMN `inline` tinyint(1) COMMENT '是否推送内部',
MODIFY COLUMN `latest_commit` varchar(50) COMMENT '最新的commit',
MODIFY COLUMN `create_time` DATETIME COMMENT '创建时间',
MODIFY COLUMN `update_time` DATETIME COMMENT '更新时间';
ALTER TABLE `sync_log`
MODIFY COLUMN `id` bigint unsigned AUTO_INCREMENT,
MODIFY COLUMN `sync_job_id` bigint unsigned COMMENT '同步工程id',
MODIFY COLUMN `log_type` varchar(20) COMMENT '单条日志类型',
MODIFY COLUMN `log` text COMMENT 'title',
MODIFY COLUMN `create_time` DATETIME COMMENT '创建时间';
ALTER TABLE `sync_log` add INDEX idx_sync_log_job_id(sync_job_id);
ALTER TABLE `sync_job` add INDEX idx_sync_job_project(project);

84
sql/table.sql Normal file
View File

@ -0,0 +1,84 @@
CREATE TABLE `github_account` (
`id` bigint unsigned NOT NULL AUTO_INCREMENT,
`domain` varchar(20) NOT NULL COMMENT '域账号',
`nickname` varchar(20) DEFAULT NULL COMMENT '花名',
`account` varchar(50) DEFAULT NULL COMMENT 'GitHub账号',
`email` varchar(50) DEFAULT NULL COMMENT '邮箱',
`create_time` DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP COMMENT '创建时间',
`update_time` DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP COMMENT '更新时间',
PRIMARY KEY (`id`)
);
CREATE TABLE `gitee_account` (
`id` bigint unsigned NOT NULL AUTO_INCREMENT,
`domain` varchar(20) NOT NULL COMMENT '域账号',
`nickname` varchar(20) DEFAULT NULL COMMENT '花名',
    `account` varchar(20) DEFAULT NULL COMMENT 'Gitee账号',
`email` varchar(20) DEFAULT NULL COMMENT '邮箱',
`create_time` DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP COMMENT '创建时间',
`update_time` DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP COMMENT '更新时间',
PRIMARY KEY (`id`)
);
CREATE TABLE `sync_project` (
`id` bigint unsigned NOT NULL AUTO_INCREMENT,
`name` varchar(50) NOT NULL COMMENT '名称',
`github` varchar(100) DEFAULT NULL COMMENT 'GitHub地址',
`gitlab` varchar(100) DEFAULT NULL COMMENT 'Gitlab地址',
`gitee` varchar(100) DEFAULT NULL COMMENT 'Gitee地址',
`code_china` varchar(100) DEFAULT NULL COMMENT 'CodeChina地址',
`gitlink` varchar(100) DEFAULT NULL COMMENT 'Gitlink地址',
`github_token` varchar(100) DEFAULT NULL COMMENT 'GitHub token',
`gitee_token` varchar(100) DEFAULT NULL COMMENT 'Gitee token',
`code_china_token` varchar(100) DEFAULT NULL COMMENT 'CodeChina token',
`gitlink_token` varchar(100) DEFAULT NULL COMMENT 'Gitlink token',
`create_time` DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP COMMENT '创建时间',
`update_time` DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP COMMENT '更新时间',
PRIMARY KEY (`id`)
);
CREATE TABLE `sync_job` (
`id` bigint unsigned NOT NULL AUTO_INCREMENT,
`project` varchar(50) NOT NULL COMMENT '工程名称',
`type` enum('OneWay','TwoWay') NOT NULL COMMENT '同步类型',
`status` tinyint(1) NOT NULL DEFAULT FALSE COMMENT '同步流状态',
`github_branch` varchar(50) DEFAULT NULL COMMENT 'GitHub分支',
`gitee_branch` varchar(50) DEFAULT NULL COMMENT 'Gitee分支',
`gitlab_branch` varchar(50) DEFAULT NULL COMMENT 'Gitlab分支',
`code_china_branch` varchar(50) DEFAULT NULL COMMENT 'CodeChina分支',
`gitlink_branch` varchar(50) DEFAULT NULL COMMENT 'Gitlink分支',
`base` enum('GitHub','Gitee','Gitlab','Gitcode','Gitlink') DEFAULT NULL COMMENT '基础仓库',
`create_time` DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP COMMENT '创建时间',
`update_time` DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP COMMENT '更新时间',
`commit` varchar(50) NOT NULL COMMENT '最新commit',
PRIMARY KEY (`id`)
);
CREATE TABLE `pull_request` (
`id` bigint unsigned NOT NULL AUTO_INCREMENT,
`pull_request_id` bigint unsigned NOT NULL COMMENT 'pull request id',
`title` text NOT NULL COMMENT 'title',
`project` varchar(20) NOT NULL COMMENT '工程名称',
`type` enum('GitHub','Gitee','Gitlab','Gitcode','Gitlink') NOT NULL COMMENT '仓库类型',
`address` varchar(100) NOT NULL COMMENT 'pull request详情页地址',
`author` varchar(20) NOT NULL COMMENT '作者',
`email` varchar(50) NOT NULL COMMENT '邮箱',
`target_branch` varchar(50) NOT NULL COMMENT '目标分支',
`inline` tinyint(1) NOT NULL DEFAULT FALSE COMMENT '是否推送内部',
`latest_commit` varchar(50) NOT NULL COMMENT '最新的commit',
-- `code_review_address` varchar(50) COMMENT 'code review地址',
`create_time` DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP COMMENT '创建时间',
`update_time` DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP COMMENT '更新时间',
PRIMARY KEY (`id`)
);
CREATE TABLE `sync_log` (
`id` bigint unsigned NOT NULL AUTO_INCREMENT,
`sync_job_id` bigint unsigned NOT NULL COMMENT '同步工程id',
`commit` varchar(50) DEFAULT NULL COMMENT 'commit',
`pull_request_id` bigint unsigned DEFAULT NULL COMMENT 'pull request id',
`log_type` varchar(20) NOT NULL COMMENT '单条日志类型',
`log` text NOT NULL COMMENT '单条日志',
`create_time` DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP COMMENT '创建时间',
PRIMARY KEY (`id`)
);

160
src/api/Account.py Normal file
View File

@ -0,0 +1,160 @@
import time
from fastapi import (
BackgroundTasks,
Query,
Depends,
Security,
Body
)
from pydantic.main import BaseModel
from typing import Optional
from src.utils.logger import logger
from extras.obfastapi.frame import Trace, DataList
from extras.obfastapi.frame import OBResponse as Response
from src.base.code import Code
from src.router import ACCOUNT as account
from src.base.error_code import ErrorTemplate, Errors
from src.api.Controller import APIController as Controller
from src.dto.account import GithubAccount as GithubAccountData
from src.service.account import GithubAccountService, GiteeAccountService
from src.dto.account import CreateAccountItem, UpdateAccountItem
class Account(Controller):
def get_user(self, cookie_key=Security(Controller.API_KEY_BUC_COOKIE), token: str = None):
return super().get_user(cookie_key=cookie_key, token=token)
@account.get("/github_accounts", response_model=Response[DataList[GithubAccountData]], description='展示GitHub账号信息')
async def list_github_account(
self,
search: Optional[str] = Query(None, description='搜索内容'),
orderby: Optional[str] = Query(None, description='排序选项'),
pageNum: int = Query(1, description="Page number"),
pageSize: int = Query(10, description="Page size")
):
account_service = GithubAccountService()
if search is not None:
search = search.replace(" ", "")
count = await account_service.get_count(search=search)
ans = await account_service.list_github_account(search)
if ans is None:
logger.error("Github accounts fetch failed")
raise Errors.QUERY_FAILD
return Response(
code=Code.SUCCESS,
data=DataList(total=count, list=ans)
)
@ account.get("/gitee_accounts", response_model=Response[DataList[GithubAccountData]], description='展示Gitee账号信息')
async def list_gitee_account(
self,
search: Optional[str] = Query(False, description='搜索内容'),
orderby: Optional[str] = Query(False, description='排序选项'),
pageNum: int = Query(1, description="Page number"),
pageSize: int = Query(10, description="Page size")
):
account_service = GiteeAccountService()
count = await account_service.get_count()
ans = await account_service.list_gitee_account()
if ans is None:
logger.error("Gitee accounts fetch failed")
raise Errors.QUERY_FAILD
return Response(
code=Code.SUCCESS,
data=DataList(total=count, list=ans)
)
@ account.post("/github_accounts", response_model=Response, description='增加一条GitHub账号信息')
async def add_github_account(
self,
item: CreateAccountItem = (...)
):
account_service = GithubAccountService()
ans = await account_service.insert_github_account(item)
if ans is None:
logger.error(f"Insert Github accounts {item.domain} failed")
raise Errors.INSERT_FAILD
return Response(
code=Code.SUCCESS,
msg="添加账号成功",
)
@ account.post("/gitee_accounts", response_model=Response, description='增加一条Gitee账号信息')
async def add_gitee_account(
self,
item: CreateAccountItem = (...)
):
account_service = GiteeAccountService()
ans = await account_service.insert_gitee_account(item)
if ans is None:
logger.error(f"Insert Gitee accounts {item.domain} failed")
raise Errors.INSERT_FAILD
return Response(
code=Code.SUCCESS,
msg="添加账号成功",
)
@ account.delete("/github_accounts", response_model=Response, description='删除一条GitHub账号信息')
async def delete_github_account(
self,
id: int = Query(..., description="账号id")
):
if not id:
raise ErrorTemplate.ARGUMENT_LACK("删除用户账号")
account_service = GithubAccountService()
ans = await account_service.delete_github_account(id)
if not ans:
logger.error(f"Delete Github accounts failed")
raise Errors.DELETE_FAILD
return Response(
code=Code.SUCCESS,
msg='删除成功'
)
@ account.delete("/gitee_accounts", response_model=Response, description='删除一条Gitee账号信息')
async def delete_gitee_account(
self,
id: int = Query(..., description="账号id")
):
if not id:
raise ErrorTemplate.ARGUMENT_LACK("删除用户账号")
account_service = GiteeAccountService()
ans = await account_service.delete_gitee_account(id)
if not ans:
logger.error(f"Delete Gitee accounts failed")
raise Errors.DELETE_FAILD
return Response(
code=Code.SUCCESS,
msg='删除成功'
)
@ account.put("/github_accounts", response_model=Response, description='更新一条GitHub账号信息')
async def update_github_account(
self,
item: UpdateAccountItem = (...)
):
account_service = GithubAccountService()
ans = await account_service.update_github_account(item)
if not ans:
raise Errors.UPDATE_FAILD
return Response(
code=Code.SUCCESS,
msg='更新成功'
)
@ account.put("/gitee_accounts", response_model=Response, description='更新一条Gitee账号信息')
async def update_gitee_account(
self,
item: UpdateAccountItem = (...)
):
account_service = GiteeAccountService()
ans = await account_service.update_gitee_account(item)
if not ans:
raise Errors.UPDATE_FAILD
return Response(
code=Code.SUCCESS,
msg='更新成功'
)

52
src/api/Auth.py Normal file
View File

@ -0,0 +1,52 @@
from fastapi import (
BackgroundTasks,
Query,
Depends,
Security,
Body
)
from pydantic.main import BaseModel
from src.utils.logger import logger
from extras.obfastapi.frame import Trace, DataList
from extras.obfastapi.frame import OBResponse as Response
from src.router import AUTH as auth
from src.base.code import Code
from src.api.Controller import APIController as Controller
from src.dto.auth import AuthItem
from src.base.error_code import ErrorTemplate, Errors
from src.common.repo import RepoType
from src.service.auth import AuthService
class Auth(Controller):
def get_user(self, cookie_key=Security(Controller.API_KEY_BUC_COOKIE), token: str = None):
return super().get_user(cookie_key=cookie_key, token=token)
@auth.post("/repo/auth", response_model=Response[Boolean], description='认证账号权限')
async def auth(
self,
item: AuthItem = Body(..., description='账号验证属性')
):
if not item:
raise ErrorTemplate.ARGUMENT_LACK("请求体")
if not item.type:
raise ErrorTemplate.ARGUMENT_LACK("账户类型")
if not item.token:
raise ErrorTemplate.ARGUMENT_LACK("账户token")
service = AuthService()
ans = service.auth(item)
if not ans:
return Response(
code=Code.SUCCESS,
data=False,
msg="账户认证失败"
)
return Response(
code=Code.SUCCESS,
data=True,
msg="账户认证成功"
)
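A client-side sketch for the endpoint above. The service host and any prefix mounted on the `auth` router are assumptions; `type` and `token` are the only `AuthItem` fields the handler validates:

```python
# Hypothetical client call; base URL and router prefix are assumptions.
import requests

payload = {"type": "GitHub", "token": "<personal-access-token>"}
resp = requests.post("http://127.0.0.1:8000/repo/auth", json=payload)
print(resp.json())  # data=True on success, data=False with msg "账户认证失败" otherwise
```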

32
src/api/Cerobot.py Normal file
View File

@ -0,0 +1,32 @@
from fastapi import (
Security
)
from pydantic.main import BaseModel
from src.utils.logger import logger
from extras.obfastapi.frame import Trace, DataList
from extras.obfastapi.frame import OBResponse as Response
from src.router import CE_ROBOT as ce_robot
from src.base.code import Code
from src.api.Controller import APIController as Controller
class Answer(BaseModel):
answer: str
class OBRobot(Controller):
def get_user(self, cookie_key=Security(Controller.API_KEY_BUC_COOKIE), token: str = None):
return super().get_user(cookie_key=cookie_key, token=token)
@ce_robot.get("", response_model=Response[Answer], description='Reposyncer')
async def get_ob_robot(
self
):
answer = Answer(answer="Hello ob-repository-synchronizer")
logger.info("Hello ob-repository-synchronizer")
return Response(
code=Code.SUCCESS,
data=answer
)
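A smoke-test sketch for the endpoint above, assuming a local deployment; the actual mount path of the `CE_ROBOT` router may differ:

```python
# Hypothetical smoke test; host, port, and path are assumptions.
import requests

resp = requests.get("http://127.0.0.1:8000/cerobot")
print(resp.json()["data"]["answer"])  # expected: "Hello ob-repository-synchronizer"
```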

40
src/api/Controller.py Normal file
View File

@ -0,0 +1,40 @@
import json
import base64
from typing import Optional
from fastapi import Security
from src.base.config import TOKEN_KEY
from extras.obfastapi.frame import Controller, User
class APIController(Controller):
def decode_token(self, token: str) -> Optional[dict]:
s = ''
try:
for _s in token:
s += chr((ord(_s) - TOKEN_KEY) % 128)
s = base64.urlsafe_b64decode(s).decode('utf-8')
return json.loads(s)
except Exception:
return None
def get_user(
self,
cookie_key: str = Security(Controller.API_KEY_BUC_COOKIE),
token: Optional[str] = None,
):
if token:
user = self.decode_token(token)
if user:
user = User(**user)
if user.emp_id:
self._user = user
if not self._user:
return super().get_user(cookie_key)
return self._user
def user(self):
user = "robot"
return user
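`decode_token` implies a matching encoder: JSON-encode the user, urlsafe-base64 it, then shift every character forward by `TOKEN_KEY` modulo 128. A sketch of that inverse (`encode_token` itself is not part of this file):

```python
import base64
import json

TOKEN_KEY = 1  # assumption: the same small integer key from src.base.config

def encode_token(user: dict) -> str:
    # Inverse of decode_token: JSON -> urlsafe base64 -> per-character shift.
    s = base64.urlsafe_b64encode(json.dumps(user).encode("utf-8")).decode("ascii")
    return "".join(chr((ord(c) + TOKEN_KEY) % 128) for c in s)

# decode_token(encode_token({"emp_id": 1, "name": "robot"})) round-trips the payload.
```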

27
src/api/Log.py Normal file
View File

@ -0,0 +1,27 @@
from fastapi import (
Security
)
from pydantic.main import BaseModel
from src.utils.logger import logger
from extras.obfastapi.frame import OBResponse as Response
from src.router import LOG as log
from src.base.code import Code
from src.api.Controller import APIController as Controller
from src.service.log import LogService
class Log(Controller):
def get_user(self, cookie_key=Security(Controller.API_KEY_BUC_COOKIE), token: str = None):
return super().get_user(cookie_key=cookie_key, token=token)
@log.delete("/log/delete", response_model=Response, description='删除日志')
async def delete_sync_logs(
self
):
service = LogService()
await service.delete_logs()
return Response(
code=Code.SUCCESS
)

181
src/api/PullRequest.py Normal file
View File

@ -0,0 +1,181 @@
import time
from fastapi import (
BackgroundTasks,
Query,
Depends,
Security,
Body
)
from typing import Optional
from starlette.exceptions import HTTPException
from src.utils.logger import logger
from extras.obfastapi.frame import Trace, DataList
from extras.obfastapi.frame import OBResponse as Response
from src.base.code import Code
from src.base.error_code import ErrorTemplate, Errors
from src.router import PULL_REQUEST as pull_request
from src.api.Controller import APIController as Controller
from src.dto.pull_request import PullRequest as PullRequestData
from src.service.pull_request import PullRequestService
from src.service.sync import ProjectService
from src.utils import github
class PullRequest(Controller):
def get_user(self, cookie_key=Security(Controller.API_KEY_BUC_COOKIE), token: str = None):
return super().get_user(cookie_key=cookie_key, token=token)
@pull_request.get("/projects/{name}/pullrequests", response_model=Response[DataList[PullRequestData]], description='列出pull request')
async def list_pull_request(
self,
name: str = Query(..., description='工程名字'),
search: Optional[str] = Query(None, description='搜索内容'),
orderby: Optional[str] = Query(None, description='排序选项')
):
await self._check_project(name)
pull_request_service = PullRequestService()
count = await pull_request_service.count_pull_request(name)
answer = await pull_request_service.fetch_pull_request(name)
if not answer:
logger.info(f"The project {name} has no pull request")
answer = []
return Response(
code=Code.SUCCESS,
data=DataList(total=count, list=answer)
)
@pull_request.get("/projects/{name}/pullrequests/sync", response_model=Response, description='列出pull request')
async def sync_pull_request(
self,
name: str = Query(..., description='工程名字')
):
resp = await self._check_project(name)
organization, repo = github.transfer_github_to_name(
resp[0].github_address)
if organization and repo:
pull_request_service = PullRequestService()
await pull_request_service.sync_pull_request(name, organization, repo)
else:
logger.error(f"The pull rquest of project {name} sync failed")
raise Errors.QUERY_FAILD
return Response(
code=Code.SUCCESS,
msg="发送同意请求成功"
)
@pull_request.get("/projects/{name}/pullrequests/{id}/approve", response_model=Response, description='同意一个pull request')
async def approve_pull_request(
self,
name: str = Query(..., description='同步工程名称'),
id: int = Query(..., description='pull request id')
):
if not name or not id:
raise ErrorTemplate.ARGUMENT_LACK()
resp = await self._check_project(name)
organization, repo = github.transfer_github_to_name(
resp[0].github_address)
if organization and repo:
pull_request_service = PullRequestService()
resp = await pull_request_service.approve_pull_request(organization, repo, id)
if not resp:
logger.error(
f"The pull rquest #{id} of project {name} approve failed")
raise Errors.QUERY_FAILD
else:
logger.error(
f"Get the project {name} organization and repo failed")
raise Errors.QUERY_FAILD
return Response(
code=Code.SUCCESS,
msg="发送同意请求成功"
)
@pull_request.get("/projects/{name}/pullrequests/{id}/merge", response_model=Response, description='合并一个pull request')
async def merge_pull_request(
self,
name: str = Query(..., description='同步工程名称'),
id: int = Query(..., description='pull request id')
):
if not name or not id:
raise ErrorTemplate.ARGUMENT_LACK()
resp = await self._check_project(name)
organization, repo = github.transfer_github_to_name(
resp[0].github_address)
if organization and repo:
pull_request_service = PullRequestService()
resp = await pull_request_service.merge_pull_request(organization, repo, id)
if not resp:
logger.error(
f"The pull rquest #{id} of project {name} merge failed")
raise Errors.QUERY_FAILD
else:
logger.error(
f"Get the project {name} organization and repo failed")
raise Errors.QUERY_FAILD
return Response(
code=Code.SUCCESS,
msg="发送合并请求成功"
)
@pull_request.get("/projects/{name}/pullrequests/{id}/close", response_model=Response, description='关闭一个pull request')
async def close_pull_request(
self,
name: str = Query(..., description='同步工程名称'),
id: int = Query(..., description='pull request id')
):
if not name or not id:
raise ErrorTemplate.ARGUMENT_LACK()
resp = await self._check_project(name)
organization, repo = github.transfer_github_to_name(
resp[0].github_address)
if organization and repo:
pull_request_service = PullRequestService()
resp = await pull_request_service.close_pull_request(organization, repo, id)
if not resp:
logger.error(
f"The pull rquest #{id} of project {name} close failed")
raise Errors.QUERY_FAILD
else:
logger.error(
f"Get the project {name} organization and repo failed")
raise Errors.QUERY_FAILD
return Response(
code=Code.SUCCESS,
msg="发送关闭请求成功"
)
@pull_request.get("/projects/{name}/pullrequests/{id}/press", response_model=Response, description='催促一个pull request')
async def press_pull_request(
self,
name: str = Query(..., description='同步工程名称'),
id: int = Query(..., description='pull request id')
):
# await self._check_project(name)
# service = PullRequestService()
# resp = await service.press_pull_request()
# if not resp:
# code = Code.INVALID_PARAMS
# msg = "发送催促请求失败"
# else:
# code = Code.SUCCESS
# msg = "发送催促请求成功"
return Response(
code=Code.SUCCESS,
msg="第二期功能,敬请期待"
)
async def _check_project(self, name: str):
project_service = ProjectService()
resp = await project_service.search_project(name=name)
if len(resp) == 0:
logger.error(
f"The project {name} is not exist")
raise Errors.QUERY_FAILD
return resp
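The handlers above depend on `github.transfer_github_to_name` returning an `(organization, repo)` pair, or falsy values on failure. The real implementation lives in `src.utils.github`; the parsing below is only a plausible reconstruction from the call sites:

```python
from urllib.parse import urlparse

def transfer_github_to_name(github_address: str):
    # "https://github.com/oceanbase/oceanbase.git" -> ("oceanbase", "oceanbase")
    try:
        parts = urlparse(github_address).path.strip("/").removesuffix(".git").split("/")
        organization, repo = parts[0], parts[1]
        return organization, repo
    except (IndexError, ValueError):
        return None, None
```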

328
src/api/Sync.py Normal file
View File

@ -0,0 +1,328 @@
import time
from fastapi import (
BackgroundTasks,
Query,
Depends,
Security,
Body
)
from pydantic.main import BaseModel
from typing import Optional
import asyncio
from src.base.error_code import ErrorTemplate, Errors
from src.utils.logger import logger
from extras.obfastapi.frame import Trace, DataList
from extras.obfastapi.frame import OBResponse as Response
from extras.obfastapi.frame import OBHTTPException as HTTPException
from src.base.code import Code
from src.router import PROJECT as project
from src.router import JOB as job
from src.api.Controller import APIController as Controller
from src.dto.sync import Project as ProjectData
from src.dto.sync import Job as JobData
from src.dto.log import Log as LogData
from src.dto.sync import SyncType, CreateProjectItem, CreateJobItem
from src.service.sync import ProjectService, JobService
from src.service.pull_request import PullRequestService
from src.service.log import LogService
from src.utils import github, gitlab, gitee, gitcode, gitlink
class Project(Controller):
def get_user(self, cookie_key=Security(Controller.API_KEY_BUC_COOKIE), token: str = None):
return super().get_user(cookie_key=cookie_key, token=token)
@project.get("", response_model=Response[DataList[ProjectData]], description='通过工程名获取一个同步工程')
async def get_project(
self,
search: Optional[str] = Query(None, description='同步工程搜索内容'),
orderby: Optional[str] = Query(None, description='排序选项'),
pageNum: Optional[int] = Query(1, description="Page number"),
pageSize: Optional[int] = Query(10, description="Page size")
):
# search
service = ProjectService()
if search is None:
count = await service.get_count()
answer = await service.list_projects(page=pageNum, size=pageSize)
else:
count = await service.get_count_by_search(search.replace(" ", ""))
answer = await service.search_project(name=search.replace(" ", ""))
if answer is None:
logger.error(f"The project list fetch failed")
raise Errors.QUERY_FAILD
return Response(
code=Code.SUCCESS,
data=DataList(total=count, list=answer)
)
@ project.post("", response_model=Response[ProjectData], description='创建一个同步工程')
async def create_project(
self,
item: CreateProjectItem = Body(..., description='同步工程属性')
):
# pre check
if not item:
raise ErrorTemplate.ARGUMENT_LACK("请求体")
if not item.name:
raise ErrorTemplate.ARGUMENT_LACK("工程名")
if item.github_address:
if not github.check_github_address(item.github_address):
raise ErrorTemplate.TIP_ARGUMENT_ERROR("GitHub仓库")
if item.gitlab_address:
if not gitlab.check_gitlab_address(item.gitlab_address):
raise ErrorTemplate.TIP_ARGUMENT_ERROR("Gitlab/Antcode仓库")
if item.gitee_address:
if not gitee.check_gitee_address(item.gitee_address):
raise ErrorTemplate.TIP_ARGUMENT_ERROR("Gitee仓库")
if item.code_china_address:
if not gitcode.check_gitcode_address(item.code_china_address):
raise ErrorTemplate.TIP_ARGUMENT_ERROR("CodeChina仓库")
# if item.gitlink_address:
# if not gitlink.check_gitlink_address(item.gitlink_address):
# raise ErrorTemplate.ARGUMENT_ERROR("Gitlink仓库")
service = ProjectService()
resp = await service.insert_project(item)
if not resp:
logger.error(f"The project insert failed")
raise Errors.INSERT_FAILD
organization, repo = github.transfer_github_to_name(
item.github_address)
if organization and repo:
pull_request_service = PullRequestService()
task = asyncio.create_task(
pull_request_service.sync_pull_request(item.name, organization, repo))
return Response(
code=Code.SUCCESS,
data=resp,
msg="创建同步工程成功"
)
@ project.delete("", response_model=Response, description='通过id删除一个同步工程')
async def delete_project(
self,
id: int = Query(..., description='同步工程id')
):
if not id:
raise ErrorTemplate.ARGUMENT_LACK("id")
# if delete the project, the front page double check firstly
project_service = ProjectService()
project = await project_service.search_project(id=id)
if not project:
raise ErrorTemplate.ARGUMENT_ERROR("同步工程id")
name = project[0].name
# delete pull request
pull_request_service = PullRequestService()
resp = await pull_request_service.fetch_pull_request(name)
if resp:
if len(resp) > 0:
for pr in resp:
await pull_request_service.delete_pull_request(pr.id)
# delete sync job
job_service = JobService()
resp = await job_service.list_jobs(project=name)
if not resp:
pass
else:
for item in resp:
await job_service.delete_job(item.id)
# delete sync project
resp = await project_service.delete_project(id)
if not resp:
logger.error(f"The project #{id} delete failed")
raise Errors.DELETE_FAILD
return Response(
code=Code.SUCCESS,
msg="删除同步工程成功"
)
class Job(Controller):
def get_user(self, cookie_key=Security(Controller.API_KEY_BUC_COOKIE), token: str = None):
return super().get_user(cookie_key=cookie_key, token=token)
@ job.get("/projects/{name}/jobs", response_model=Response[DataList[JobData]], description='列出所有同步流')
async def list_jobs(
self,
name: str = Query(..., description='同步工程名'),
search: Optional[str] = Query(None, description='同步工程搜索内容'),
source: Optional[str] = Query(None, description='分支来源'),
pageNum: Optional[int] = Query(1, description="Page number"),
pageSize: Optional[int] = Query(10, description="Page size")
):
if not name:
raise ErrorTemplate.ARGUMENT_LACK("工程名")
service = JobService()
if search is not None:
search = search.replace(" ", "")
answer = await service.list_jobs(project=name, search=search, source=source, page=pageNum, size=pageSize)
if not answer:
return Response(
code=Code.SUCCESS,
data=DataList(total=0, list=[]),
msg="没有同步流"
)
count = await service.count_job(project=name, search=search, source=source)
return Response(
code=Code.SUCCESS,
data=DataList(total=count, list=answer),
msg="查询同步流成功"
)
@ job.post("/projects/{name}/jobs", response_model=Response[JobData], description='创建一个同步流')
async def create_job(
self,
name: str = Query(..., description='同步工程名'),
item: CreateJobItem = Body(..., description='同步流属性')
):
if not name:
raise ErrorTemplate.ARGUMENT_LACK("工程名")
if not item:
raise ErrorTemplate.ARGUMENT_LACK("JSON")
if not item.type:
raise ErrorTemplate.ARGUMENT_LACK("分支同步类型")
service = JobService()
ans = await service.create_job(name, item)
if not ans:
logger.error(f"Create a job of project #{name} failed")
raise Errors.INSERT_FAILD
return Response(
code=Code.SUCCESS,
data=ans,
msg="创建同步流成功"
)
@ job.put("/projects/{name}/jobs/{id}/start", response_model=Response, description='开启一个同步流')
async def start_job(
self,
name: str = Query(..., description='同步工程名'),
id: int = Query(..., description='同步流id')
):
if not name:
raise ErrorTemplate.ARGUMENT_LACK("工程名")
if not id:
raise ErrorTemplate.ARGUMENT_LACK("同步流id")
service = JobService()
ans = await service.update_status(id, True)
if not ans:
logger.error(f"The job #{id} of project #{name} start failed")
raise Errors.UPDATE_FAILD
return Response(
code=Code.SUCCESS,
msg="开启同步流成功"
)
@ job.put("/projects/{name}/jobs/{id}/stop", response_model=Response, description='停止一个同步流')
async def stop_job(
self,
name: str = Query(..., description='同步工程名'),
id: int = Query(..., description='同步流id')
):
if not name:
raise ErrorTemplate.ARGUMENT_LACK("工程名")
if not id:
raise ErrorTemplate.ARGUMENT_LACK("同步流id")
service = JobService()
ans = await service.update_status(id, False)
if not ans:
logger.error(f"The job #{id} of project #{name} stop failed")
raise Errors.UPDATE_FAILD
return Response(
code=Code.SUCCESS,
msg="关闭同步流成功"
)
@ job.delete("/projects/{name}/jobs", response_model=Response, description='通过id删除一个同步流')
async def delete_job(
self,
name: str = Query(..., description='同步工程名'),
id: int = Query(..., description='同步流id')
):
if not name:
raise ErrorTemplate.ARGUMENT_LACK("工程名")
if not id:
raise ErrorTemplate.ARGUMENT_LACK("同步流id")
service = JobService()
ans = await service.delete_job(id)
if not ans:
logger.error(f"The job #{id} of project #{name} delete failed")
raise Errors.DELETE_FAILD
return Response(
code=Code.SUCCESS,
msg="删除同步流成功"
)
@ job.put("/projects/{name}/jobs/{id}/set_commit", response_model=Response, description='通过id设置一个同步流的commit')
async def set_job_commit(
self,
name: str = Query(..., description='同步工程名'),
id: int = Query(..., description='同步流id'),
commit: str = Query(..., description='commit'),
):
if not name:
raise ErrorTemplate.ARGUMENT_LACK("工程名")
if not id:
raise ErrorTemplate.ARGUMENT_LACK("同步流id")
service = JobService()
job = await service.get_job(id)
if not job:
logger.error(f"The job #{id} of project #{name} is not exist")
raise Errors.UPDATE_FAILD
# only one-way sync jobs allow setting the commit manually
if job.type == SyncType.TwoWay:
logger.error(f"The job #{id} of project #{name} is two way sync")
raise HTTPException(Code.OPERATION_FAILED, 'Twoway同步方式无法修改commit值')
ans = await service.update_job_lateset_commit(id, commit)
if not ans:
logger.error(
f"The job #{id} of project #{name} update latest commit failed")
raise Errors.UPDATE_FAILD
return Response(
code=Code.SUCCESS,
msg="设置同步流commit成功"
)
@ job.get("/projects/{name}/jobs/{id}/logs", response_model=Response[DataList[LogData]], description='列出所有同步流')
async def get_job_log(
self,
name: str = Query(..., description='同步工程名'),
id: int = Query(..., description='同步流id'),
pageNum: Optional[int] = Query(1, description="Page number"),
pageSize: Optional[int] = Query(1000, description="Page size")
):
if not name:
raise ErrorTemplate.ARGUMENT_LACK("工程名")
if not id:
raise ErrorTemplate.ARGUMENT_LACK("同步流id")
project_service = ProjectService()
projects = await project_service.search_project(name=name)
if len(projects) == 0:
raise ErrorTemplate.ARGUMENT_ERROR("工程名")
service = LogService()
log = await service.get_logs_by_job(id, pageNum, pageSize)
data = []
for rep_log in log:
log_str = rep_log.log
if projects[0].gitee_token:
log_str = log_str.replace(projects[0].gitee_token, "******")
if projects[0].github_token:
log_str = log_str.replace(projects[0].github_token, "******")
rep_log.log = log_str
data.append(rep_log)
if len(log) == 0:
logger.info(
f"The job #{id} of project #{name} has no logs")
count = await service.count_logs(id)
return Response(
code=Code.SUCCESS,
data=DataList(total=count, list=data)
)

256
src/api/Sync_config.py Normal file
View File

@ -0,0 +1,256 @@
import time
from fastapi import (
Body,
Path,
Depends,
Query,
Security
)
from typing import Dict
from starlette.requests import Request
from src.utils import base
from src.utils.sync_log import sync_log, LogType, api_log
from src.api.Controller import APIController as Controller
from src.router import SYNC_CONFIG as router
from src.do.sync_config import SyncDirect
from src.dto.sync_config import SyncRepoDTO, SyncBranchDTO, LogDTO
from src.service.sync_config import SyncService, LogService
from src.service.cronjob import sync_repo_task, sync_branch_task
from src.base.status_code import Status, SYNCResponse, SYNCException
from src.service.cronjob import GITMSGException
class SyncDirection(Controller):
def __init__(self, *args, **kwargs):
self.service = SyncService()
self.log_service = LogService()
super().__init__(*args, **kwargs)
# 提供获取操作人员信息定义接口, 无任何实质性操作
def user(self):
return super().user()
@router.post("/repo", response_model=SYNCResponse, description='配置同步仓库')
async def create_sync_repo(
self, request: Request, user: str = Depends(user),
dto: SyncRepoDTO = Body(..., description="绑定同步仓库信息")
):
api_log(LogType.INFO, f"用户 {user} 使用 POST 方法访问接口 {request.url.path} ", user)
if not base.check_addr(dto.external_repo_address) or not base.check_addr(dto.internal_repo_address):
return SYNCResponse(
code_status=Status.REPO_ADDR_ILLEGAL.code,
msg=Status.REPO_ADDR_ILLEGAL.msg
)
if dto.sync_granularity not in [1, 2]:
return SYNCResponse(code_status=Status.SYNC_GRAN_ILLEGAL.code, msg=Status.SYNC_GRAN_ILLEGAL.msg)
if dto.sync_direction not in [1, 2]:
return SYNCResponse(code_status=Status.SYNC_DIRE_ILLEGAL.code, msg=Status.SYNC_DIRE_ILLEGAL.msg)
if await self.service.same_name_repo(repo_name=dto.repo_name):
return SYNCResponse(
code_status=Status.REPO_EXISTS.code,
msg=Status.REPO_EXISTS.msg
)
repo = await self.service.create_repo(dto)
return SYNCResponse(
code_status=Status.SUCCESS.code,
data=repo,
msg=Status.SUCCESS.msg
)
@router.post("/{repo_name}/branch", response_model=SYNCResponse, description='配置同步分支')
async def create_sync_branch(
self, request: Request, user: str = Depends(user),
repo_name: str = Path(..., description="仓库名称"),
dto: SyncBranchDTO = Body(..., description="绑定同步分支信息")
):
api_log(LogType.INFO, f"用户 {user} 使用 POST 方法访问接口 {request.url.path} ", user)
try:
repo_id = await self.service.check_status(repo_name, dto)
except SYNCException as Error:
return SYNCResponse(
code_status=Error.code_status,
msg=Error.status_msg
)
branch = await self.service.create_branch(dto, repo_id=repo_id)
return SYNCResponse(
code_status=Status.SUCCESS.code,
data=branch,
msg=Status.SUCCESS.msg
)
@router.get("/repo", response_model=SYNCResponse, description='获取同步仓库信息')
async def get_sync_repos(
self, request: Request, user: str = Depends(user),
page_num: int = Query(1, description="页数"), page_size: int = Query(10, description="条数"),
create_sort: bool = Query(False, description="创建时间排序, 默认倒序")
):
api_log(LogType.INFO, f"用户 {user} 使用 GET 方法访问接口 {request.url.path} ", user)
repos = await self.service.get_sync_repo(page_num=page_num, page_size=page_size, create_sort=create_sort)
if repos is None:
return SYNCResponse(
code_status=Status.NOT_DATA.code,
msg=Status.NOT_DATA.msg
)
return SYNCResponse(
code_status=Status.SUCCESS.code,
data=repos,
msg=Status.SUCCESS.msg
)
@router.get("/{repo_name}/branch", response_model=SYNCResponse, description='获取仓库对应的同步分支信息')
async def get_sync_branches(
self, request: Request, user: str = Depends(user),
repo_name: str = Path(..., description="查询的仓库名称"),
page_num: int = Query(1, description="页数"), page_size: int = Query(10, description="条数"),
create_sort: bool = Query(False, description="创建时间排序, 默认倒序")
):
api_log(LogType.INFO, f"用户 {user} 使用 GET 方法访问接口 {request.url.path} ", user)
try:
repo_id = await self.service.get_repo_id(repo_name=repo_name)
except SYNCException as Error:
return SYNCResponse(
code_status=Error.code_status,
msg=Error.status_msg
)
branches = await self.service.get_sync_branches(repo_id=repo_id, page_num=page_num,
page_size=page_size, create_sort=create_sort)
if len(branches) < 1:
return SYNCResponse(
code_status=Status.NOT_DATA.code,
msg=Status.NOT_DATA.msg
)
return SYNCResponse(
code_status=Status.SUCCESS.code,
data=branches,
msg=Status.SUCCESS.msg
)
@router.post("/repo/{repo_name}", response_model=SYNCResponse, description='执行仓库同步')
async def sync_repo(
self, request: Request, user: str = Depends(user),
repo_name: str = Path(..., description="仓库名称")
):
api_log(LogType.INFO, f"用户 {user} 使用 POST 方法访问接口 {request.url.path} ", user)
repo = await self.service.get_repo(repo_name=repo_name)
if repo is None:
return SYNCResponse(code_status=Status.REPO_NOTFOUND.code, msg=Status.REPO_NOTFOUND.msg)
if not repo.enable:
return SYNCResponse(code_status=Status.NOT_ENABLE.code, msg=Status.NOT_ENABLE.msg)
try:
await sync_repo_task(repo, user)
except GITMSGException as GITError:
return SYNCResponse(
code_status=GITError.status,
msg=GITError.msg
)
return SYNCResponse(
code_status=Status.SUCCESS.code,
msg=Status.SUCCESS.msg
)
@router.post("/{repo_name}/branch/{branch_name}", response_model=SYNCResponse, description='执行分支同步')
async def sync_branch(
self, request: Request, user: str = Depends(user),
repo_name: str = Path(..., description="仓库名称"),
branch_name: str = Path(..., description="分支名称"),
sync_direct: int = Query(..., description="同步方向: 1 表示内部仓库同步到外部, 2 表示外部仓库同步到内部")
):
api_log(LogType.INFO, f"用户 {user} 使用 POST 方法访问接口 {request.url.path} ", user)
repo = await self.service.get_repo(repo_name=repo_name)
if repo is None:
return SYNCResponse(code_status=Status.REPO_NOTFOUND.code, msg=Status.REPO_NOTFOUND.msg)
if not repo.enable:
return SYNCResponse(code_status=Status.NOT_ENABLE.code, msg=Status.NOT_ENABLE.msg)
if sync_direct not in [1, 2]:
return SYNCResponse(code_status=Status.SYNC_DIRE_ILLEGAL.code, msg=Status.SYNC_DIRE_ILLEGAL.msg)
direct = SyncDirect(sync_direct)
branches = await self.service.sync_branch(repo_id=repo.id, branch_name=branch_name, dire=direct)
if len(branches) < 1:
return SYNCResponse(code_status=Status.NOT_ENABLE.code, msg=Status.NOT_ENABLE.msg)
try:
await sync_branch_task(repo, branches, direct, user)
except GITMSGException as GITError:
return SYNCResponse(
code_status=GITError.status,
msg=GITError.msg
)
return SYNCResponse(code_status=Status.SUCCESS.code, msg=Status.SUCCESS.msg)
@router.delete("/repo/{repo_name}", response_model=SYNCResponse, description='仓库解绑')
async def delete_repo(
self, request: Request, user: str = Depends(user),
repo_name: str = Path(..., description="仓库名称")
):
api_log(LogType.INFO, f"用户 {user} 使用 DELETE 方法访问接口 {request.url.path} ", user)
data = await self.service.delete_repo(repo_name=repo_name)
return SYNCResponse(
code_status=data.code_status,
msg=data.status_msg
)
@router.delete("/{repo_name}/branch/{branch_name}", response_model=SYNCResponse, description='分支解绑')
async def delete_branch(
self, request: Request, user: str = Depends(user),
repo_name: str = Path(..., description="仓库名称"),
branch_name: str = Path(..., description="分支名称")
):
api_log(LogType.INFO, f"用户 {user} 使用 DELETE 方法访问接口 {request.url.path} ", user)
data = await self.service.delete_branch(repo_name=repo_name, branch_name=branch_name)
return SYNCResponse(
code_status=data.code_status,
msg=data.status_msg
)
@router.put("/repo/{repo_name}", response_model=SYNCResponse, description='更新仓库同步状态')
async def update_repo_status(
self, request: Request, user: str = Depends(user),
repo_name: str = Path(..., description="仓库名称"),
enable: bool = Query(..., description="同步启用状态")
):
api_log(LogType.INFO, f"用户 {user} 使用 PUT 方法访问接口 {request.url.path} ", user)
data = await self.service.update_repo(repo_name=repo_name, enable=enable)
return SYNCResponse(
code_status=data.code_status,
msg=data.status_msg
)
@router.put("/{repo_name}/branch/{branch_name}", response_model=SYNCResponse, description='更新分支同步状态')
async def update_branch_status(
self, request: Request, user: str = Depends(user),
repo_name: str = Path(..., description="仓库名称"),
branch_name: str = Path(..., description="分支名称"),
enable: bool = Query(..., description="同步启用状态")
):
api_log(LogType.INFO, f"用户 {user} 使用 PUT 方法访问接口 {request.url.path} ", user)
data = await self.service.update_branch(repo_name=repo_name, branch_name=branch_name, enable=enable)
return SYNCResponse(
code_status=data.code_status,
msg=data.status_msg
)
@router.get("/repo/{repo_name}/logs", response_model=SYNCResponse, description='获取仓库/分支日志')
async def get_logs(
self, request: Request, user: str = Depends(user),
repo_name: str = Path(..., description="仓库名称"),
branch_id: int = Query(None, description="分支id,仓库粒度同步无需输入")
):
api_log(LogType.INFO, f"用户 {user} 使用 GET 方法访问接口 {request.url.path} ", user)
data = await self.log_service.get_logs(repo_name=repo_name, branch_id=branch_id)
return SYNCResponse(
code_status=Status.SUCCESS.code,
data=data,
msg=Status.SUCCESS.msg
)

26
src/api/User.py Normal file
View File

@ -0,0 +1,26 @@
from fastapi import Security, Depends
from src.dto.user import UserInfoDto
from extras.obfastapi.frame import OBResponse as Response
from src.api.Controller import APIController as Controller
from src.router import USER as user
class User(Controller):
def get_user(self, cookie_key=Security(Controller.API_KEY_BUC_COOKIE), token=None):
return super().get_user(cookie_key=cookie_key, token=token)
@user.get("/info", response_model=Response[UserInfoDto], description="获得用户信息")
async def get_user_info(
self,
user: Security = Depends(get_user)
):
return Response(
data=UserInfoDto(
name=user.name,
nick=user.nick,
emp_id=user.emp_id,
email=user.email,
dept=user.dept
)
)

0
src/base/__init__.py Normal file
View File

16
src/base/code.py Normal file
View File

@ -0,0 +1,16 @@
class Code:
SUCCESS = 200
ERROR = 500
INVALID_PARAMS = 400
FORBIDDEN = 403
NOT_FOUND = 404
OPERATION_FAILED = 406
class LogType:
INFO = 'info'
ERROR = 'ERROR'
WARNING = 'warning'
DEBUG = "debug"

168
src/base/config.py Normal file
View File

@ -0,0 +1,168 @@
# coding: utf-8
import os
from extras.obfastapi.config import ConfigsUtil, MysqlConfig, RedisConfig
def getenv(key, default=None, _type=None):
value = os.getenv(key)
if value:
if _type == bool:
return value.lower() == 'true'
else:
return _type(value) if _type else value
else:
return default
LOCAL_ENV = 'LOCAL'
DEV_ENV = 'DEV'
PORD_ENV = 'PROD'
SYS_ENV = getenv('SYS_ENV', LOCAL_ENV)
LOG_PATH = getenv('LOG_PATH')
LOG_LEVEL = getenv('LOG_LV', 'DEBUG')
LOG_SAVE = getenv('LOG_SAVE', True)
DELETE_SYNC_DIR = getenv('DELETE_SYNC_DIR', False)
LOG_DETAIL = getenv('LOG_DETAIL', True)
SYNC_DIR = os.getenv("SYNC_DIR", "/tmp/sync_dir/")
buc_key = getenv('BUC_KEY', "OBRDE_DEV_USER_SIGN")
buc_key and ConfigsUtil.set_obfastapi_config('buc_key', buc_key)
DB_ENV = getenv('DB_ENV', 'test_env')
DB = {
'test_env': {
'host': getenv('CEROBOT_MYSQL_HOST', ''),
'port': getenv('CEROBOT_MYSQL_PORT', 2883, int),
'user': getenv('CEROBOT_MYSQL_USER', ''),
'passwd': getenv('CEROBOT_MYSQL_PWD', ''),
'dbname': getenv('CEROBOT_MYSQL_DB', '')
},
'local': {
'host': getenv('CEROBOT_MYSQL_HOST', ''),
'port': getenv('CEROBOT_MYSQL_PORT', 2881, int),
'user': getenv('CEROBOT_MYSQL_USER', ''),
'passwd': getenv('CEROBOT_MYSQL_PWD', ''),
'dbname': getenv('CEROBOT_MYSQL_DB', '')
}
}
for key in DB:
conf = MysqlConfig(**DB[key])
ConfigsUtil.set_mysql_config(key, conf)
SERVER_HOST = getenv('SERVER_HOST', '')
TOKEN_KEY = int(getenv("TOKEN_KEY", 1))
ACCOUNT = {
'username': getenv('OB_ROBOT_USERNAME', ''),
'email': getenv('OB_ROBOT_USERNAME', ''),
'github_token': getenv('GITHUB_TOKEN', ''),
'gitee_token': getenv('GITEE_TOKEN', ''),
'gitlab_token': getenv('GITLAB_TOKEN', ''), # 暂时还是我的token待替代为一个内部账号
'antcode_token': getenv('ANTCODE_TOKEN', ''), # 暂时还是我的token待替代为一个内部账号
'gitcode_token': getenv('GITCODE_TOKEN', ''), # 暂时还是我的token待替代为ob-robot账号
'robot_code_token': getenv('ROBOT_CODE_TOKEN', ''),
'robot_antcode_token': getenv('ROBOT_ANTCODE_TOKEN', '')
}
GITLAB_ENV = {
'gitlab_api_address': getenv('GITLAB_API_HOST', ''),
'gitlab_api_pullrequest_address': getenv('GITLAB_API_PULLREQUEST_HOST', '')
}
GITHUB_ENV = {
'github_api_address': getenv('GITHUB_API_HOST', ''),
'github_api_diff_address': getenv('GITHUB_API_DIFF_HOST', ''),
}
GITEE_ENV = {
'gitee_api_address': getenv('GITEE_API_HOST', 'https://api.gitee.com/repos'),
'gitee_api_diff_address': getenv('GITEE_API_DIFF_HOST', 'https://gitee.com'),
}
GITLINK_ENV = {
'gitlink_api_address': getenv('GITLINK_API_HOST', ''),
'gitlink_api_diff_address': getenv('GITLINK_API_DIFF_HOST', ''),
}
GITCODE_ENV = {
'gitcode_api_address': getenv('GITCODE_API_HOST', ''),
'gitcode_api_diff_address': getenv('GITCODE_API_DIFF_HOST', ''),
}
DOCKER_ENV = {
'el7': getenv('EL7_DOCKER_IMAGE'),
'el8': getenv('EL8_DOCKER_IMAGE')
}
ERROR_REPORT_EMAIL = getenv('EORROR_TO', '')
DEFAULT_DK_RECEIVERS = getenv('DF_DK_TO', '')
DEFAULT_EMAIL_RECEIVERS = getenv('DF_EMAIL_TO', '')
NOTIFY = {
# todo yaml当中配置新的生产环境host
'host': getenv('NOTIFY_HOST', ''),
'user': getenv('NOTIFY_USER', ''), # 一般当前项目的项目名
'report_sender': getenv('NOTIFY_REPORT_SENDER', ''),
'email_sender': getenv('EMAIL_NOTIFY_SENDER'),
'dk_sender': getenv('DK_NOTIFY_SENDER', ''),
'dd_sender': getenv('DD_NOTIFY_SENDER', ''),
'dd_sender_issue': getenv('DD_NOTIFY_SENDER_ISSUE', ''),
'dd_sender_log': getenv('DD_NOTIFY_SENDER_LOG', ''),
'dd_sender_ghpr': getenv('DD_NOTIFY_SENDER_PR', ''),
}
# log level
ConfigsUtil.set_obfastapi_config('log_level', 'WARN')
# ConfigsUtil.set_obfastapi_config('log_name', 'mysql_test')
# ConfigsUtil.set_obfastapi_config('log_path', 'test.log')
SECRET_SCAN = getenv('SECRET_SCAN', True)
# Symmetric encryption key
DATA_ENCRYPT_KEY = getenv('DATA_ENCRYPT_KEY', '')
# web base_url
base_url = getenv('OB_ROBOT_BASE_URL', '')
CACHE_DB = {
"cerobot": {
"host": getenv("CEROBOT_REDIS_HOST", ""),
"port": getenv("CEROBOT_REDIS_PORT", 6379, int),
"password": getenv("CEROBOT_REDIS_PWD", "")
}
}
for key in CACHE_DB:
conf = RedisConfig(**CACHE_DB[key])
ConfigsUtil.set_redis_config(key, conf)
GIT_DEPTH = getenv('GIT_DEPTH', 100, int)
SECRET_SCAN_THREAD = getenv('SECRET_SCAN_THREAD', 4, int)
OCEANBASE = getenv('OCEANBASE', 'ob-mirror')
OBProjectIdInAone = getenv('OB_PROJECT_ID_AONE', 2015510, int)
strip_name = getenv("STRIP_NAME", "/.ce")
# observer repo cache
OCEANBASE_REPO_BASE_DIR = getenv('OCEANBASE_REPO_BASE_DIR', '')
# oceanbase internal repo, create github feature branch
OCEANBASE_REPO = getenv('OCEANBASE_REPO', '')
# oceanbase ce publish repo, the ob backup repo
# it need origin(oceanbase-ce-publish oceanbase) and github(github oceanbase) git configration
OCEANBASE_BACKUP_REPO = getenv('OCEANBASE_BACKUP_REPO', '')
# github repo
OCEANBASE_GITHUB_REPO = getenv('OCEANBASE_GITHUB_REPO', '')
ROLE_CHECK = getenv('ROLE_CHECK', True, bool)
# oss config
OSS_CONFIG = {
"id": getenv("OSSIV", ""),
"secret": getenv("OSSKEY", ""),
"bucket": getenv("OSSBUCKET", ""),
"endpoint": getenv("OSSENDPOINT", ""),
"download_base": getenv("DOWNLOAD_BASE", "")
}
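One quirk of the `getenv` helper at the top of this file is worth calling out: without `_type=bool`, a set variable comes back as its raw string (so the string `'false'` is truthy), and with `_type=bool` only the literal `true` (any case) maps to `True`. A standalone demonstration:

```python
import os

# Same logic as the getenv helper above, reproduced standalone for illustration.
def getenv(key, default=None, _type=None):
    value = os.getenv(key)
    if value:
        if _type == bool:
            return value.lower() == 'true'
        return _type(value) if _type else value
    return default

os.environ["LOG_DETAIL"] = "false"
print(getenv("LOG_DETAIL", True))        # 'false' (truthy string, not False)
print(getenv("LOG_DETAIL", True, bool))  # False
os.environ["LOG_DETAIL"] = "1"
print(getenv("LOG_DETAIL", True, bool))  # False ('1' is not 'true')
```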

26
src/base/error_code.py Normal file
View File

@ -0,0 +1,26 @@
from .code import Code
from extras.obfastapi.frame import OBHTTPException as HTTPException
class Errors:
FORBIDDEN = HTTPException(Code.FORBIDDEN, '权限不足')
QUERY_FAILD = HTTPException(Code.OPERATION_FAILED, '记录查询失败')
INSERT_FAILD = HTTPException(Code.OPERATION_FAILED, '记录插入失败')
DELETE_FAILD = HTTPException(Code.OPERATION_FAILED, '记录删除失败')
UPDATE_FAILD = HTTPException(Code.OPERATION_FAILED, '记录更新失败')
METHOD_EORROR = HTTPException(Code.INVALID_PARAMS, '错误的请求方式')
NOT_INIT = HTTPException(555, '服务器缺少配置, 未能完成初始化')
class ErrorTemplate:
def ARGUMENT_LACK(did): return HTTPException(
Code.NOT_FOUND, '参数[%s]不能为空' % did)
def ARGUMENT_ERROR(did): return HTTPException(
Code.NOT_FOUND, '参数[%s]有错' % did)
def TIP_ARGUMENT_ERROR(did): return HTTPException(
Code.NOT_FOUND, '请输入正确的%s地址' % did)
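The templates above are plain functions looked up on the class, so callers build a parameterized `HTTPException` without instantiating anything:

```python
from src.base.error_code import ErrorTemplate

name = ""
if not name:
    # Raises HTTPException(Code.NOT_FOUND, '参数[工程名]不能为空')
    raise ErrorTemplate.ARGUMENT_LACK("工程名")
```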

93
src/base/status_code.py Normal file
View File

@ -0,0 +1,93 @@
from enum import Enum, unique
from pydantic import BaseModel
from typing import Optional, Generic, TypeVar, Dict, Any
Data = TypeVar('Data')
@unique
class Status(Enum):
# 成功返回
SUCCESS = (0, "操作成功")
# 请求异常
REPO_ADDR_ILLEGAL = (10001, "仓库地址格式有误,请检查")
REPO_EXISTS = (10002, "仓库已存在,请勿重复创建。如果同步方向不同,请更换易识别名称再次创建")
BRANCH_EXISTS = (10003, "分支已存在,请勿重复绑定")
GRANULARITY_ERROR = (10004, "仓库粒度同步,无需添加分支信息")
NOT_FOUND = (10005, "分支信息获取为空")
NOT_ENABLE = (10006, "仓库/分支未启用同步,请检查更新同步启用状态")
SYNC_GRAN_ILLEGAL = (10007, "sync_granularity: 1 表示仓库粒度的同步, 2 表示分支粒度的同步")
SYNC_DIRE_ILLEGAL = (10008, "sync_direction: 1 表示内部仓库同步到外部, 2 表示外部仓库同步到内部")
REPO_NULL = (10009, "仓库未绑定,请先绑定仓库,再绑定分支")
REPO_NOTFOUND = (10010, "未查找到仓库")
GRANULARITY_DELETE = (10011, "仓库粒度同步,没有分支可解绑")
BRANCH_DELETE = (10012, "仓库中不存在此分支")
NOT_DATA = (10013, "没有数据")
GRANULARITY = (10014, "仓库粒度同步,没有分支信息")
# git执行异常
PERMISSION_DENIED = (20001, "SSH 密钥未授权或未添加")
REPO_NOT_FOUND = (20002, "仓库不存在或私有仓库访问权限不足")
RESOLVE_HOST_FAIL = (20003, "无法解析主机")
CONNECT_TIME_OUT = (20004, "连接超时")
AUTH_FAIL = (20005, "认证失败 (用户名和密码、个人访问令牌、SSH 密钥等)")
CREATE_WORK_TREE_FAIL = (20006, "没有权限在指定的本地目录创建文件或目录")
DIRECTORY_EXIST = (20007, "本地目录冲突 (本地已存在同名目录,无法创建新的工作树目录)")
NOT_REPO = (20008, "当前的工作目录不是一个git仓库")
NOT_BRANCH = (20009, "分支不存在")
PUST_REJECT = (20010, "推送冲突")
REFUSE_PUST = (20011, "推送到受保护的分支被拒绝")
UNKNOWN_ERROR = (20012, "Unknown git error.")
@property
def code(self) -> int:
# 返回状态码信息
return self.value[0]
@property
def msg(self) -> str:
# 返回状态码说明信息
return self.value[1]
git_error_mapping = {
"Permission denied": Status.PERMISSION_DENIED,
"Repository not found": Status.REPO_NOT_FOUND,
"not a git repository": Status.REPO_NOT_FOUND,
"Could not resolve host": Status.RESOLVE_HOST_FAIL,
"Connection timed out": Status.CONNECT_TIME_OUT,
"Could not read from remote repository.": Status.REPO_NOT_FOUND,
"Authentication failed": Status.AUTH_FAIL,
"could not create work tree": Status.CREATE_WORK_TREE_FAIL,
"already exists and is not an empty directory": Status.DIRECTORY_EXIST,
"The current directory is not a git repository": Status.NOT_REPO,
"couldn't find remote ref": Status.NOT_BRANCH,
"is not a commit and a branch": Status.NOT_BRANCH,
"[rejected]": Status.PUST_REJECT,
"refusing to update": Status.REFUSE_PUST
}
class SYNCException(Exception):
def __init__(self, status: Status):
self.code_status = status.code
self.status_msg = status.msg
class SYNCResponse(BaseModel):
code_status: Optional[int] = 0
data: Optional[Data] = None
msg: Optional[str] = ''
class GITMSGException(Exception):
def __init__(self, status: Status, repo='', branch=''):
self.status = status.code
self.msg = status.msg
# class SYNCResponse(GenericModel, Generic[Data]):
# code_status: int = 200
# data: Optional[Data] = None
# msg: str = ''
# success: bool = True
# finished: bool = True
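A sketch of how `git_error_mapping` is presumably consumed (the real consumer in `src.service.cronjob` is not shown here): scan git's stderr for the first known substring and raise the mapped status, falling back to `UNKNOWN_ERROR`:

```python
from src.base.status_code import GITMSGException, Status, git_error_mapping

def raise_for_git_stderr(stderr: str) -> None:
    # First matching substring wins; unknown messages map to UNKNOWN_ERROR.
    for needle, status in git_error_mapping.items():
        if needle in stderr:
            raise GITMSGException(status)
    raise GITMSGException(Status.UNKNOWN_ERROR)

# raise_for_git_stderr("fatal: couldn't find remote ref test")  # -> 20009 分支不存在
```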

24
src/common/crawler.py Normal file
View File

@ -0,0 +1,24 @@
import json
import requests
def Fetch(url: str, way: str, query=None, header=None, data=None):
if url is None:
return None
if way == 'Get':
response = requests.get(url=url, params=query, headers=header)
return response.json()
elif way == 'Post':
response = requests.post(
url=url, params=query, headers=header, data=json.dumps(data))
return response.json()
elif way == 'Patch':
response = requests.patch(
url=url, params=query, headers=header, data=json.dumps(data))
return response.json()
elif way == 'Put':
response = requests.put(
url=url, params=query, headers=header, data=json.dumps(data))
return response.json()
else:
return None
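A usage sketch for `Fetch` above; httpbin.org is only a public echo service used for illustration:

```python
from src.common.crawler import Fetch

resp = Fetch("https://httpbin.org/get", way="Get", query={"q": "reposyncer"})
print(resp["args"])  # {'q': 'reposyncer'}
```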

265
src/common/gitcode.py Normal file
View File

@ -0,0 +1,265 @@
from .repo import Repo
from .crawler import Fetch
from src.base import config
from src.dao.pull_request import PullRequestDAO
from src.do.pull_request import PullRequestDO
import shlex
import subprocess
from sqlalchemy import text
from src.utils.logger import logger
class Gitcode(Repo):
organization: str
name: str
project: str
pull_request = []
token = config.ACCOUNT['gitcode_token']
def __init__(self, project, organization, name):
super().__init__(project, organization, name)
def fetch_pull_request(self):
url = f"{config.GITHUB_ENV['gitcode_api_address']}/{self.organization}/{self.name}/pulls"
token = self.token
qs = {
'access_token': token}
data = Fetch(url=url, way='Get', query=qs)
if data is None or len(data) == 0:
logger.info(
f"the gitcode repo {self.organization}/{self.name} has no pull request")
else:
for pull in data:
pr = PullRequest(self.organization, self.name, pull['number'])
pr.project = self.project
pr.state = 'open'
pr.commit_url = pull['commits_url']
pr.inline = False
pr.comment_url = pull['comments_url']
pr.title = pull['title']
pr.html_url = pull['html_url']
pr.target_branch = pull['base']['ref']
self.pull_request.append(pr)
logger.info(f"fetch the pull request {pr.id} successfully")
async def save_pull_request(self):
if len(self.pull_request) == 0:
logger.info("no pull request need to save")
return
for pr in self.pull_request:
await pr.save()
class PullRequest(Gitcode):
url: str
project: str
html_url: str
author: str
review_url: str
state: str
commit_url: str
inline: bool
comment_url: str
title: str
target_branch: str
latest_commit: str
token = config.ACCOUNT['gitcode_token']
def __init__(self, organization, name: str, id: int):
self.url = f"{config.GITHUB_ENV['github_api_address']}/{organization}/{name}/pulls/{id}"
self.id = id
self.type = 'GitHub'
self.organization = organization
self.name = name
self._pull_request_dao = PullRequestDAO()
def fetch_commit(self):
header = {
'Authorization': 'token ' + self.token}
resp = Fetch(self.commit_url, header=header, way='Get')
self.author = resp[0]['commit']['author']['name']
self.email = resp[0]['commit']['author']['email']
def fetch_comment_url(self):
header = {
'Authorization': 'token ' + self.token}
resp = Fetch(self.url, header=header, way='Get')
self.comment_url = resp['comments_url']
def fetch_comment(self):
self.fetch_comment_url()
header = {
'Authorization': 'token ' + self.token}
resp = Fetch(self.comment_url, header=header, way='Get')
comments = []
for item in resp:
comments.append(item['body'])
return comments
def _clone(self):
dir = "/tmp/" + self.name + "_pr" + str(self.id)
address = f"git@github.com:{self.organization}/{self.name}.git"
subprocess.run(shlex.split('mkdir ' + dir), cwd='.')
subprocess.run(shlex.split('git clone ' + address), cwd=dir)
return dir
def _apply_diff(self, dir, branch: str):
subprocess.run(shlex.split('git checkout ' + branch), cwd=dir)
new_branch = 'pr' + str(self.id)
subprocess.run(shlex.split('git checkout -b ' + new_branch), cwd=dir)
diff_file = self._get_diff()
subprocess.run(shlex.split('git apply ' + diff_file), cwd=dir)
subprocess.run(shlex.split('git add .'), cwd=dir)
subprocess.run(shlex.split(
"git commit -m '" + self.title + "'"), cwd=dir)
subprocess.run(shlex.split(
'git push -u origin ' + new_branch), cwd=dir)
def _get_diff(self):
filename = "/tmp/github_pr" + str(self.id) + "_diff"
baseUrl = f"{config.GITHUB_ENV['github_api_diff_address']}/{self.organization}/{self.name}/pull/"
diffUrl = baseUrl + str(self.id) + ".diff"
cmd = "curl -X GET " + diffUrl + \
" -H 'Accept: application/vnd.github.v3.diff'"
with open(filename, 'w') as outfile:
subprocess.call(shlex.split(cmd), stdout=outfile)
return filename
def _send_merge_request(self):
url = f"{config.GITHUB_ENV['github_api_address']}/{self.organization}/{self.name}/issues/{self.id}/merge"
header = {
'Authorization': 'token ' + self.token,
'Content-Type': 'application/json'}
data = {
"merge_method": "squash"
}
resp = Fetch(url, header=header, data=data, way='Put')
if resp is None:
logger.error("send merge request failed")
return False
return True
def comment(self, comment: str):
url = f"{config.GITHUB_ENV['github_api_address']}/{self.organization}/{self.name}/issues/{self.id}/comments"
header = {
'Authorization': 'token ' + self.token,
'Content-Type': 'application/json'}
data = {"body": comment}
resp = Fetch(url, header=header, data=data, way='Post')
if resp is None:
logger.error("send comment request failed")
return False
return True
def approve(self):
url = f"{config.GITHUB_ENV['github_api_address']}/{self.organization}/{self.name}/issues/{self.id}/reviews"
header = {
'Authorization': 'token ' + self.token,
'Content-Type': 'application/json'}
data = {
"body": "LGTM",
"event": "APPROVE"
}
resp = Fetch(url, header=header, data=data, way='Post')
if resp is None:
logger.error("send approve request failed")
return False
return True
def close(self):
url = f"{config.GITHUB_ENV['github_api_address']}/{self.organization}/{self.name}/pulls/{self.id}"
header = {
'Authorization': 'token ' + self.token,
'Accept': 'application/vnd.github.v3+json'}
data = {"state": "closed"}
resp = Fetch(url, header=header, data=data, way='Patch')
if resp is None:
logger.error("send close pull request failed")
return False
return True
def get_latest_commit(self):
url = f"{config.GITHUB_ENV['github_api_address']}/{self.organization}/{self.name}/pulls/{self.id}/commits"
header = {
'Authorization': 'token ' + self.token}
data = Fetch(url, header=header, way='Get')
if data is None or len(data) == 0:
logger.info(
f"the pull request {self.id} of github repo {self.organization}/{self.name} has no commits")
else:
self.latest_commit = data[0]['sha']
async def save(self):
self.fetch_commit()
count = await self._pull_request_dao._count(PullRequestDO, text(f"pull_request_id = '{self.id}'"))
if count == 0:
# insert pull request repo
ans = await self._pull_request_dao.insert_pull_request(
self.id, self.title, self.project, self.type, self.html_url,
self.author, self.email, self.target_branch, "NULL")
else:
# update pull request repo
await self._pull_request_dao.update_pull_request(
self.id, self.title, self.project, self.type, self.html_url,
self.author, self.email, self.target_branch)
ans = True
if not ans:
logger.error(f"save the pull request {self.id} failed")
else:
logger.info(f"save the pull request {self.id} successfully")
return
async def sync(self):
comments = self.fetch_comment()
if len(comments) == 0:
logger.info(f"the gitcode pull request #{self.id} has no comment")
return
merge = False
cicd = False
for comment in comments:
if comment == '/merge':
logger.info(
f"the gitcode pull request #{self.id} needs to be merged")
merge = True
if comment == '/cicd':
logger.info(f"the gitcode pull request #{self.id} needs cicd")
cicd = True
if cicd:
pass
if merge:
self.merge_to_inter()
return
async def check_if_merge(self):
comments = self.fetch_comment()
if len(comments) == 0:
logger.info(f"the gitcode pull request {self.id} has no comment")
return False
merge = False
for comment in comments:
if comment == '/merge':
logger.info(f"the gitcode pull request {self.id} needs to be merged")
merge = True
return merge
async def check_if_new_commit(self, origin_latest_commit: str):
# True when the latest commit matches the recorded one (i.e. nothing new)
self.get_latest_commit()
return self.latest_commit == origin_latest_commit

258
src/common/gitee.py Normal file
View File

@ -0,0 +1,258 @@
from .repo import Repo
from .crawler import Fetch
from src.base import config
from src.dao.pull_request import PullRequestDAO
from src.do.pull_request import PullRequestDO
import shlex
import subprocess
from sqlalchemy import text
from src.utils.logger import logger
class Gitee(Repo):
organization: str
name: str
project: str
pull_request = []
token = config.ACCOUNT['gitee_token']
def __init__(self, project, organization, name):
super().__init__(project, organization, name)
def fetch_pull_request(self):
url = f"{config.GITEE_ENV['gitee_api_address']}/{self.organization}/{self.name}/pulls"
token = self.token
qs = {
'access_token': token}
data = Fetch(url=url, way='Get', query=qs)
if data is None or len(data) == 0:
logger.info(
f"the gitee repo {self.organization}/{self.name} has no pull request")
else:
for pull in data:
pr = PullRequest(self.organization, self.name, pull['number'])
pr.project = self.project
pr.state = 'open'
pr.commit_url = pull['commits_url']
pr.inline = False
pr.comment_url = pull['_links']['comments']
pr.title = pull['title']
pr.html_url = pull['html_url']
pr.target_branch = pull['base']['ref']
self.pull_request.append(pr)
logger.info(f"fetch the pull request {pr.id} successfully")
async def save_pull_request(self):
if len(self.pull_request) == 0:
logger.info("no pull request need to save")
return
for pr in self.pull_request:
await pr.save()
class PullRequest(Gitee):
url: str
project: str
html_url: str
author: str
review_url: str
state: str
commit_url: str
inline: bool
comment_url: str
title: str
target_branch: str
latest_commit: str
token = config.ACCOUNT['gitee_token']
def __init__(self, organization, name: str, id: int):
self.url = f"{config.GITEE_ENV['gitee_api_address']}/{organization}/{name}/pulls/{id}"
self.id = id
self.type = 'Gitee'
self.organization = organization
self.name = name
self._pull_request_dao = PullRequestDAO()
def fetch_commit(self):
qs = {
'access_token': self.token}
resp = Fetch(self.commit_url, way='Get', query=qs)
self.author = resp[0]['commit']['author']['name']
self.email = resp[0]['commit']['author']['email']
def fetch_comment_url(self):
qs = {
'access_token': self.token}
resp = Fetch(self.url, way='Get', query=qs)
self.comment_url = resp['comments_url']
def fetch_comment(self):
self.fetch_comment_url()
qs = {
'access_token': self.token}
resp = Fetch(self.comment_url, way='Get', query=qs)
comments = []
for item in resp:
comments.append(item['body'])
return comments
def _clone(self):
dir = "/tmp/" + self.name + "_pr" + str(self.id)
address = f"git@gitee.com:{self.organization}/{self.name}.git"
subprocess.run(shlex.split('mkdir ' + dir), cwd='.')
subprocess.run(shlex.split('git clone ' + address), cwd=dir)
return dir
def _apply_diff(self, dir, branch: str):
subprocess.run(shlex.split('git checkout ' + branch), cwd=dir)
new_branch = 'pr' + str(self.id)
subprocess.run(shlex.split('git checkout -b ' + new_branch), cwd=dir)
diff_file = self._get_diff(dir)
subprocess.run(shlex.split('git apply ' + diff_file), cwd=dir)
subprocess.run(shlex.split('git add .'), cwd=dir)
subprocess.run(shlex.split(
"git commit -m '" + self.title + "'"), cwd=dir)
subprocess.run(shlex.split(
'git push -u origin ' + new_branch), cwd=dir)
def _get_diff(self, dir: str):
filename = "/tmp/gitee_pr" + str(self.id) + "_diff"
baseUrl = f"{config.GITEE_ENV['gitee_api_diff_address']}/{self.organization}/{self.name}/pull/"
diffUrl = baseUrl + str(self.id) + ".diff"
cmd = f"curl -X GET {diffUrl}"
with open(filename, 'w') as outfile:
subprocess.call(shlex.split(cmd), stdout=outfile)
return filename
def _send_merge_request(self):
url = f"{config.GITEE_ENV['gitee_api_address']}/{self.organization}/{self.name}/issues/{self.id}/merge"
qs = {
'access_token': self.token}
data = {
"merge_method": "squash"
}
resp = Fetch(url, way='Put', query=qs, data=data)
if resp is None:
logger.error("send merge request failed")
return False
return True
def comment(self, comment: str):
url = f"{config.GITEE_ENV['gitee_api_address']}/{self.organization}/{self.name}/issues/{self.id}/comments"
qs = {
'access_token': self.token}
data = {"body": comment}
resp = Fetch(url, way='Post', query=qs, data=data)
if resp is None:
logger.error("send comment request failed")
return False
return True
def approve(self):
url = f"{config.GITEE_ENV['gitee_api_address']}/{self.organization}/{self.name}/issues/{self.id}/reviews"
qs = {
'access_token': self.token}
data = {"body": "LGTM",
"event": "APPROVE"}
resp = Fetch(url, way='Post', query=qs, data=data)
if resp is None:
logger.error("send approve request failed")
return False
return True
def close(self):
url = f"{config.GITEE_ENV['gitee_api_address']}/{self.organization}/{self.name}/pulls/{self.id}"
qs = {
'access_token': self.token}
data = {"state": "closed"}
resp = Fetch(url, way='Patch', query=qs, data=data)
if resp is None:
logger.error("send close pull request failed")
return False
return True
def get_latest_commit(self):
url = f"{config.GITEE_ENV['gitee_api_address']}/{self.organization}/{self.name}/pulls/{self.id}/commits"
qs = {
'access_token': self.token}
data = Fetch(url, way='Get', query=qs)
if data is None or len(data) == 0:
logger.info(
f"the pull request {self.id} of gitee repo {self.organization}/{self.name} has no commits")
else:
self.latest_commit = data[0]['sha']
async def save(self):
self.fetch_commit()
count = await self._pull_request_dao._count(PullRequestDO, text(f"pull_request_id = '{self.id}'"))
if count == 0:
# insert pull request repo
ans = await self._pull_request_dao.insert_pull_request(
self.id, self.title, self.project, self.type, self.html_url,
self.author, self.email, self.target_branch, "NULL")
else:
# update pull request repo
await self._pull_request_dao.update_pull_request(
self.id, self.title, self.project, self.type, self.html_url,
self.author, self.email, self.target_branch)
ans = True
if not ans:
logger.error(f"save the pull request {self.id} failed")
else:
logger.info(f"save the pull request {self.id} successfully")
return
async def sync(self):
comments = self.fetch_comment()
if len(comments) == 0:
logger.info(f"the gitee pull request #{self.id} has no comment")
return
merge = False
cicd = False
for comment in comments:
if comment == '/merge':
logger.info(
f"the gitee pull request #{self.id} need to merge")
merge = True
if comment == '/cicd':
logger.info(f"the gitee pull request #{self.id} need to cicd")
cicd = True
if cicd:
pass
if merge:
self.merge_to_inter()
return
async def check_if_merge(self):
comments = self.fetch_comment()
if len(comments) == 0:
logger.info(f"the gitee pull request {self.id} has no comment")
return False
merge = False
for comment in comments:
if comment == '/merge':
logger.info(f"the gitee pull request {self.id} need to merge")
merge = True
return merge
async def check_if_new_commit(self, origin_latest_commit: str):
# NOTE: returns True when the stored commit is still the latest, i.e. no new commit
latest_commit = self.get_latest_commit()
return latest_commit == origin_latest_commit
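All three PullRequest variants in this commit (Gitee here, GitHub and Gitlink below) honor the same comment-command protocol: a '/merge' comment marks the PR for merging and '/cicd' flags it for CI. A minimal sketch of driving it by hand, assuming the Gitee constructor takes (organization, name, id) like the GitHub variant and that the placeholder names exist:

```python
from src.common.gitee import PullRequest

async def request_merge():
    pr = PullRequest('acme', 'widget', 42)  # organization, repo name, PR id
    pr.comment('/merge')                    # post the merge command as a comment
    if await pr.check_if_merge():           # True once a '/merge' comment exists
        await pr.sync()                     # '/merge' triggers the merge path
```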

257
src/common/github.py Normal file

@ -0,0 +1,257 @@
from src.dto.sync import Project
from .repo import Repo
from .crawler import Fetch
from src.base import config
from src.dao.pull_request import PullRequestDAO
import shlex
import subprocess
from sqlalchemy import text
from src.utils.logger import logger
class Github(Repo):
organization: str
name: str
project: str
pull_request_list = []
token = config.ACCOUNT['github_token']
def __init__(self, project, organization, name: str):
super().__init__(project, organization, name)
def fetch_pull_request(self):
url = f"{config.GITHUB_ENV['github_api_address']}/{self.organization}/{self.name}/pulls"
token = self.token
header = {
'Authorization': 'token ' + token}
data = Fetch(url, header=header, way='Get')
if data is None or len(data) == 0:
logger.info(
f"the github repo {self.organization}/{self.name} has no pull request")
else:
for pull in data:
pr = PullRequest(self.organization, self.name, pull['number'])
pr.project = self.project
pr.state = 'open'
pr.commit_url = pull['commits_url']
pr.inline = False
pr.comment_url = pull['comments_url']
pr.title = pull['title']
pr.html_url = pull['html_url']
pr.target_branch = pull['base']['ref']
self.pull_request_list.append(pr)
logger.info(f"fetch the pull request {pr.id} successfully")
async def save_pull_request(self):
if len(self.pull_request_list) == 0:
logger.info("no pull request need to save")
return
for pr in self.pull_request_list:
await pr.save()
class PullRequest(Github):
id: int
url: str
project: str
html_url: str
author: str
review_url: str
state: str
commit_url: str
inline: bool
comment_url: str
title: str
target_branch: str
latest_commit: str
token = config.ACCOUNT['github_token']
def __init__(self, organization: str, name: str, id: int):
self.url = f"{config.GITHUB_ENV['github_api_address']}/{organization}/{name}/pulls/{id}"
self.id = id
self.type = 'GitHub'
self.organization = organization
self.name = name
self._pull_request_dao = PullRequestDAO()
def fetch_commit(self):
logger.info(self.id)
header = {
'Authorization': 'token ' + self.token}
resp = Fetch(self.commit_url, header=header, way='Get')
self.author = resp[0]['commit']['author']['name']
self.email = resp[0]['commit']['author']['email']
def fetch_comment_url(self):
header = {
'Authorization': 'token ' + self.token}
resp = Fetch(self.url, header=header, way='Get')
self.comment_url = resp['comments_url']
def fetch_comment(self):
self.fetch_comment_url()
header = {
'Authorization': 'token ' + self.token}
resp = Fetch(self.comment_url, header=header, way='Get')
comments = []
for item in resp:
comments.append(item['body'])
return comments
def _clone(self):
dir = "/tmp/" + self.name + "_pr" + str(self.id)
address = f"git@github.com:{self.organization}/{self.name}.git"
subprocess.run(shlex.split('mkdir -p ' + dir), cwd='.')
# clone into dir itself (trailing '.') so the git commands below run
# inside the work tree rather than one level above it
subprocess.run(shlex.split('git clone ' + address + ' .'), cwd=dir)
return dir
def _apply_diff(self, dir, branch: str):
subprocess.run(shlex.split('git checkout ' + branch), cwd=dir)
new_branch = 'pr' + str(self.id)
subprocess.run(shlex.split('git checkout -b ' + new_branch), cwd=dir)
# apply the diff file fetched for this pull request, not the directory
subprocess.run(shlex.split('git apply ' + self._get_diff()), cwd=dir)
subprocess.run(shlex.split('git add .'), cwd=dir)
subprocess.run(shlex.split(
"git commit -m '" + self.title + "'"), cwd=dir)
subprocess.run(shlex.split(
'git push -uv origin ' + new_branch), cwd=dir)
def _get_diff(self):
filename = "/tmp/github_pr" + str(self.id) + "_diff"
baseUrl = f"{config.GITHUB_ENV['github_api_diff_address']}/{self.organization}/{self.name}/pull/"
diffUrl = baseUrl + str(self.id) + ".diff"
cmd = "curl -X GET " + diffUrl + \
" -H 'Accept: application/vnd.github.v3.diff'"
with open(filename, 'w') as outfile:
subprocess.call(shlex.split(cmd), stdout=outfile)
return filename
def _send_merge_request(self):
url = f"{config.GITHUB_ENV['github_api_address']}/{self.organization}/{self.name}/issues/{self.id}/merge"
header = {
'Authorization': 'token ' + self.token,
'Content-Type': 'application/json'}
data = {
"merge_method": "squash"
}
resp = Fetch(url, header=header, data=data, way='Put')
if resp is None:
logger.error("send merge request failed")
return False
return True
def comment(self, comment: str):
url = f"{config.GITHUB_ENV['github_api_address']}/{self.organization}/{self.name}/issues/{self.id}/comments"
header = {
'Authorization': 'token ' + self.token,
'Content-Type': 'application/json'}
data = {"body": comment}
resp = Fetch(url, header=header, data=data, way='Post')
if resp is None:
logger.error("send comment request failed")
return False
return True
def approve(self):
url = f"{config.GITHUB_ENV['github_api_address']}/{self.organization}/{self.name}/issues/{self.id}/reviews"
header = {
'Authorization': 'token ' + self.token,
'Content-Type': 'application/json'}
data = {"body": "LGTM",
"event": "APPROVE"}
resp = Fetch(url, header=header, data=data, way='Post')
if resp is None:
logger.error("send approve request failed")
return False
return True
def close(self):
url = f"{config.GITHUB_ENV['github_api_address']}/{self.organization}/{self.name}/pulls/{self.id}"
header = {
'Authorization': 'token ' + self.token,
'Accept': 'application/vnd.github.v3+json'}
data = {"state": "closed"}
resp = Fetch(url, header=header, data=data, way='Patch')
if resp is None:
logger.error("send close pull request failed")
return False
return True
def get_latest_commit(self):
url = f"{config.GITHUB_ENV['github_api_address']}/{self.organization}/{self.name}/pulls/{self.id}/commits"
header = {
'Authorization': 'token ' + self.token}
data = Fetch(url, header=header, way='Get')
if data is None or len(data) == 0:
logger.info(
f"the pull request {self.id} of github repo {self.organization}/{self.name} has no commits")
else:
self.latest_commit = data[0]['sha']
return self.latest_commit
async def save(self):
self.fetch_commit()
pr = await self._pull_request_dao.fetch(text(f"pull_request_id = {self.id} and project = '{self.project}'"))
if not pr:
# insert pull request repo
resp = await self._pull_request_dao.insert_pull_request(
self.id, self.title, self.project, self.type, self.html_url,
self.author, self.email, self.target_branch, "NULL")
if not resp:
logger.error(f"save the pull request {self.id} failed")
else:
logger.info(f"save the pull request {self.id} successfully")
else:
# update pull request repo
resp = await self._pull_request_dao.update_pull_request(
self.id, self.title, self.project, self.type, self.html_url,
self.author, self.email, self.target_branch)
if not resp:
logger.error(f"update the pull request {self.id} failed")
else:
logger.info(f"update the pull request {self.id} successfully")
return
async def sync(self):
comments = self.fetch_comment()
if len(comments) == 0:
logger.info(f"the github pull request #{self.id} has no comment")
return
merge = False
cicd = False
for comment in comments:
if comment == '/merge':
logger.info(
f"the github pull request #{self.id} need to merge")
merge = True
if comment == '/cicd':
logger.info(f"the github pull request #{self.id} need to cicd")
cicd = True
if cicd:
pass
if merge:
self.merge_to_inter()
return
async def check_if_merge(self):
comments = self.fetch_comment()
if len(comments) == 0:
logger.info(f"the github pull request {self.id} has no comment")
return
merge = False
for comment in comments:
if comment == '/merge':
logger.info(f"the github pull request {self.id} need to merge")
merge = True
return merge
async def check_if_new_commit(self, origin_latest_commit: str):
# NOTE: returns True when the stored commit is still the latest, i.e. no new commit
latest_commit = self.get_latest_commit()
return latest_commit == origin_latest_commit
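A minimal polling sketch for the Github class above (hypothetical wiring; the project, organization, and repository names are placeholders, not values from this commit):

```python
import asyncio

from src.common.github import Github

async def poll_once():
    repo = Github(project="demo", organization="acme", name="widget")
    repo.fetch_pull_request()       # fills repo.pull_request_list from the API
    await repo.save_pull_request()  # upserts each PR through PullRequestDAO
    for pr in repo.pull_request_list:
        await pr.sync()             # reacts to '/merge' and '/cicd' comments

asyncio.run(poll_once())
```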

263
src/common/gitlink.py Normal file

@ -0,0 +1,263 @@
from .repo import Repo
from .crawler import Fetch
from src.base import config
from src.dao.pull_request import PullRequestDAO
from src.do.pull_request import PullRequestDO
import shlex
import subprocess
from sqlalchemy import text
from src.utils.logger import logger
class Gitlink(Repo):
organization: str
name: str
project: str
pull_request = []
token = config.ACCOUNT['github_token']
def __init__(self, project, organization, name):
super().__init__(project, organization, name)
def fetch_pull_request(self):
url = f"{config.GITHUB_ENV['github_api_address']}/{self.organization}/{self.name}/pulls"
token = self.token
header = {
'Authorization': 'token ' + token}
data = Fetch(url, header=header, way='Get')
if data is None or len(data) == 0:
logger.info(
f"the github repo {self.organization}/{self.name} has no pull request")
else:
for pull in data:
pr = PullRequest(self.organization, self.name, pull['number'])
pr.project = self.project
pr.state = 'open'
pr.commit_url = pull['commits_url']
pr.inline = False
pr.comment_url = pull['comments_url']
pr.title = pull['title']
pr.html_url = pull['html_url']
pr.target_branch = pull['base']['ref']
self.pull_request.append(pr)
logger.info(f"fetch the pull request {pr.id} successfully")
async def save_pull_request(self):
if len(self.pull_request) == 0:
logger.info("no pull request need to save")
return
for pr in self.pull_request:
await pr.save()
class PullRequest(Gitlink):
url: str
project: str
html_url: str
author: str
review_url: str
state: str
commit_url: str
inline: bool
comment_url: str
title: str
target_branch: str
latest_commit: str
token = config.ACCOUNT['github_token']
def __init__(self, organization, name: str, id: int):
self.url = f"{config.GITHUB_ENV['github_api_address']}/{organization}/{name}/pulls/{id}"
self.id = id
self.type = 'Gitlink'
self.organization = organization
self.name = name
self._pull_request_dao = PullRequestDAO()
def fetch_commit(self):
header = {
'Authorization': 'token ' + self.token}
resp = Fetch(self.commit_url, header=header, way='Get')
self.author = resp[0]['commit']['author']['name']
self.email = resp[0]['commit']['author']['email']
def fetch_comment_url(self):
header = {
'Authorization': 'token ' + self.token}
resp = Fetch(self.url, header=header, way='Get')
self.comment_url = resp['comments_url']
def fetch_comment(self):
self.fetch_comment_url()
header = {
'Authorization': 'token ' + self.token}
resp = Fetch(self.comment_url, header=header, way='Get')
comments = []
for item in resp:
comments.append(item['body'])
return comments
def _clone(self):
dir = "/tmp/" + self.name + "_pr" + str(self.id)
address = f"git@github.com:{self.organization}/{self.name}.git"
subprocess.run(shlex.split('mkdir -p ' + dir), cwd='.')
# clone into dir itself (trailing '.') so the git commands below run
# inside the work tree rather than one level above it
subprocess.run(shlex.split('git clone ' + address + ' .'), cwd=dir)
return dir
def _apply_diff(self, dir, branch: str):
subprocess.run(shlex.split('git checkout ' + branch), cwd=dir)
new_branch = 'pr' + str(self.id)
subprocess.run(shlex.split('git checkout -b ' + new_branch), cwd=dir)
# apply the diff file fetched for this pull request, not the directory
subprocess.run(shlex.split('git apply ' + self._get_diff()), cwd=dir)
subprocess.run(shlex.split('git add .'), cwd=dir)
subprocess.run(shlex.split(
"git commit -m '" + self.title + "'"), cwd=dir)
subprocess.run(shlex.split(
'git push -uv origin ' + new_branch), cwd=dir)
def _get_diff(self):
filename = "/tmp/github_pr" + str(self.id) + "_diff"
baseUrl = f"{config.GITHUB_ENV['github_api_diff_address']}/{self.organization}/{self.name}/pull/"
diffUrl = baseUrl + str(self.id) + ".diff"
cmd = "curl -X GET " + diffUrl + \
" -H 'Accept: application/vnd.github.v3.diff'"
with open(filename, 'w') as outfile:
subprocess.call(shlex.split(cmd), stdout=outfile)
return filename
def _send_merge_request(self):
url = f"{config.GITHUB_ENV['github_api_address']}/{self.organization}/{self.name}/issues/{self.id}/merge"
header = {
'Authorization': 'token ' + self.token,
'Content-Type': 'application/json'}
data = {
"merge_method": "squash"
}
resp = Fetch(url, header=header, data=data, way='Put')
if resp is None:
logger.error("send merge request failed")
return False
return True
def comment(self, comment: str):
url = f"{config.GITHUB_ENV['github_api_address']}/{self.organization}/{self.name}/issues/{self.id}/comments"
header = {
'Authorization': 'token ' + self.token,
'Content-Type': 'application/json'}
data = {"body": comment}
resp = Fetch(url, header=header, data=data, way='Post')
if resp is None:
logger.error("send comment request failed")
return False
return True
def approve(self):
url = f"{config.GITHUB_ENV['github_api_address']}/{self.organization}/{self.name}/issues/{self.id}/reviews"
header = {
'Authorization': 'token ' + self.token,
'Content-Type': 'application/json'}
data = {"body": "LGTM",
"event": "APPROVE"}
resp = Fetch(url, header=header, data=data, way='Post')
if resp is None:
logger.error("send approve request failed")
return False
return True
def close(self):
url = f"{config.GITHUB_ENV['github_api_address']}/{self.organization}/{self.name}/pulls/{self.id}"
header = {
'Authorization': 'token ' + self.token,
'Accept': 'application/vnd.github.v3+json'}
data = {"state": "closed"}
resp = Fetch(url, header=header, data=data, way='Patch')
if resp is None:
logger.error("send close pull request failed")
return False
return True
def get_latest_commit(self):
url = f"{config.GITHUB_ENV['github_api_address']}/{self.organization}/{self.name}/pulls/{self.id}/commits"
header = {
'Authorization': 'token ' + self.token}
data = Fetch(url, header=header, way='Get')
if data is None or len(data) == 0:
logger.info(
f"the pull request {self.id} of gitlink repo {self.organization}/{self.name} has no commits")
else:
self.latest_commit = data[0]['sha']
return self.latest_commit
async def save(self):
self.fetch_commit()
count = await self._pull_request_dao._count(PullRequestDO, text(f"pull_request_id = '{self.id}'"))
if count == 0:
# insert pull request repo
ans = await self._pull_request_dao.insert_pull_request(
self.id, self.title, self.project, self.type, self.html_url,
self.author, self.email, self.target_branch, "NULL")
else:
# update pull request repo
await self._pull_request_dao.update_pull_request(
self.id, self.title, self.project, self.type, self.html_url,
self.author, self.email, self.target_branch)
ans = True
if not ans:
logger.error(f"save the pull request {self.id} failed")
else:
logger.info(f"save the pull request {self.id} successfully")
return
async def sync(self):
comments = self.fetch_comment()
if len(comments) == 0:
logger.info(f"the github pull request #{self.id} has no comment")
return
merge = False
cicd = False
for comment in comments:
if comment == '/merge':
logger.info(
f"the gitlink pull request #{self.id} need to merge")
merge = True
if comment == '/cicd':
logger.info(f"the github pull request #{self.id} need to cicd")
cicd = True
if cicd:
pass
if merge:
self.merge_to_inter()
return
async def check_if_merge(self):
comments = self.fetch_comment()
if len(comments) == 0:
logger.info(f"the github pull request {self.id} has no comment")
return
merge = False
for comment in comments:
if comment == '/merge':
logger.info(f"the github pull request {self.id} need to merge")
merge = True
return merge
async def check_if_new_commit(self, origin_latest_commit: str):
# NOTE: returns True when the stored commit is still the latest, i.e. no new commit
latest_commit = self.get_latest_commit()
return latest_commit == origin_latest_commit

14
src/common/repo.py Normal file

@ -0,0 +1,14 @@
class Repo(object):
def __init__(self, project, organization, name):
self.project = project
self.organization = organization
self.name = name
class RepoType:
Github = 'GitHub'
Gitee = 'Gitee'
Gitlab = 'Gitlab'
Gitcode = 'Gitcode'
Gitlink = 'Gitlink'


@ -0,0 +1,35 @@
from src.common.github import Github
from src.common.github import PullRequest as GithubPullRequest
from src.common.gitee import Gitee, PullRequest as GiteePullRequest
from src.common.gitcode import Gitcode, PullRequest as GitcodePullRequest
from src.common.gitlink import Gitlink, PullRequest as GitlinkPullRequest
from src.common.repo import RepoType
class RepoFactory(object):
@staticmethod
def create(type, project: str, organization: str, name: str):
# compare against the RepoType constants ('GitHub', 'Gitee', ...) so the
# strings match the type values stored on pull requests
if type == RepoType.Github:
return Github(project, organization, name)
elif type == RepoType.Gitee:
return Gitee(project, organization, name)
elif type == RepoType.Gitcode:
return Gitcode(project, organization, name)
elif type == RepoType.Gitlink:
return Gitlink(project, organization, name)
else:
return None
class PullRequestFactory(object):
@staticmethod
def create(type: str, organization: str, name: str, id: int):
if type == RepoType.Github:
return GithubPullRequest(organization, name, id)
elif type == RepoType.Gitee:
return GiteePullRequest(organization, name, id)
elif type == RepoType.Gitcode:
return GitcodePullRequest(organization, name, id)
elif type == RepoType.Gitlink:
return GitlinkPullRequest(organization, name, id)
else:
return None
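A usage sketch for the factories (the factory module's own path is not shown in this commit view, so the snippet assumes RepoFactory and PullRequestFactory are importable; the repository names are placeholders):

```python
from src.common.repo import RepoType

repo = RepoFactory.create(RepoType.Gitee, 'demo', 'acme', 'widget')
pr = PullRequestFactory.create(RepoType.Gitee, 'acme', 'widget', 42)
if repo is None or pr is None:
    raise ValueError('unsupported repo type')
```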

92
src/dao/account.py Normal file

@ -0,0 +1,92 @@
from typing import Any, List, Optional
from sqlalchemy import select, update, func, text
from src.dto.account import UpdateAccountItem
from .mysql_ao import MysqlAO
from src.do.account import GithubAccountDO, GiteeAccountDO
class AccountDAO(MysqlAO):
_DO_class = None
async def insert_githubAccount(self, domain, nickname, account, email: str) -> Optional[List[GithubAccountDO]]:
data = {
'domain': domain,
'nickname': nickname,
'account': account,
'email': email,
"create_time": await self.get_now_time(),
"update_time": await self.get_now_time()
}
return await self._insert_one(self._DO_class(**data))
async def insert_giteeAccount(self, domain, nickname, account, email: str) -> Optional[List[GiteeAccountDO]]:
data = {
'domain': domain,
'nickname': nickname,
'account': account,
'email': email,
"create_time": await self.get_now_time(),
"update_time": await self.get_now_time()
}
return await self._insert_one(self._DO_class(**data))
async def update_account(self, item: UpdateAccountItem) -> bool:
stmt = update(self._DO_class).where(
self._DO_class.id == item.id).values(
domain=item.domain,
nickname=item.nickname,
account=item.account,
email=item.email,
update_time=await self.get_now_time()
)
async with self._async_session() as session:
async with session.begin():
return (await session.execute(stmt)).rowcount > 0
async def fetch(self, cond: Any = None, limit: int = 0, start: int = 0) -> List[_DO_class]:
stmt = select(self._DO_class).order_by(
self._DO_class.update_time.desc())
if cond is not None:
stmt = stmt.where(cond)
if limit:
stmt = stmt.limit(limit)
if start:
stmt = stmt.offset(start)
async with self._async_session() as session:
answer = list((await session.execute(stmt)).all())
return answer
async def delete_account(self, id: int) -> bool:
async with self._async_session() as session:
async with session.begin():
account = await session.get(self._DO_class, id)
if account:
try:
await session.delete(account)
return True
except:
pass
return False
async def _count(self, cond: Any = None) -> int:
cond = text(cond) if isinstance(cond, str) else cond
async with self._async_session() as session:
if cond is not None:
stmt = select(func.count(self._DO_class.domain)).where(cond)
else:
stmt = select(func.count(self._DO_class.domain))
return ((await session.execute(stmt)).scalar_one())
class GithubAccountDAO(AccountDAO):
_DO_class = GithubAccountDO
class GiteeAccountDAO(AccountDAO):
_DO_class = GiteeAccountDO

44
src/dao/log.py Normal file

@ -0,0 +1,44 @@
from typing import Any, List, Optional
from sqlalchemy import select, delete
from .mysql_ao import MysqlAO
from src.do.log import LogDO
class LogDAO(MysqlAO):
_DO_class = LogDO
async def insert_log(self, id, type, msg) -> Optional[List[LogDO]]:
data = {
'sync_job_id': id,
'log_type': type,
'log': msg,
"create_time": await self.get_now_time()
}
return await self._insert_one(self._DO_class(**data))
async def fetch(self, cond: Any = None, limit: int = 0, start: int = 0) -> List[LogDO]:
stmt = select(self._DO_class).order_by(
self._DO_class.id.desc())
if cond is not None:
stmt = stmt.where(cond)
if limit:
stmt = stmt.limit(limit)
if start:
stmt = stmt.offset(start)
async with self._async_session() as session:
answer = list((await session.execute(stmt)).all())
return answer
async def delete_log(self, cond: Any = None) -> Optional[bool]:
stmt = delete(self._DO_class)
if cond is not None:
stmt = stmt.where(cond)
async with self._async_session() as session:
# run inside a transaction so the delete is actually committed
async with session.begin():
await session.execute(stmt)
return True
async def _get_count(self, cond) -> int:
return await self._count(self._DO_class, cond)

90
src/dao/mysql_ao.py Normal file

@ -0,0 +1,90 @@
from typing import Any, List, Optional, Type, Union
from datetime import datetime
from sqlalchemy import select, func, text
from sqlalchemy.dialects.mysql import insert
from extras.obfastapi.mysql import aiomysql_session, ORMAsyncSession
from src.do.data_object import DataObject
from src.base import config  # load configuration
class MysqlAO:
def __init__(self, key: str = config.DB_ENV):
self._db_key = key
self.now_time = None
self.now_time_stamp = 0
def _async_session(self) -> ORMAsyncSession:
return aiomysql_session(self._db_key)
async def _insert_all(self, do: List[DataObject]) -> Optional[List[DataObject]]:
async with self._async_session() as session:
async with session.begin():
try:
session.add_all(do)
await session.flush()
return do
except:
await session.rollback()
return None
async def _insert_one(self, do: DataObject) -> Optional[DataObject]:
async with self._async_session() as session:
async with session.begin():
try:
session.add(do)
await session.flush()
return do
except Exception as e:
await session.rollback()
return None
async def _delete_one(self, clz: Type[DataObject], _id: Union[int, str]) -> Optional[bool]:
async with self._async_session() as session:
async with session.begin():
data = await session.get(clz, _id)
if data:
try:
await session.delete(data)
return True
except:
await session.rollback()
return None
async def _count(self, clz: Type[DataObject], cond: Any = None) -> int:
field = clz.emp_id if hasattr(clz, 'emp_id') else clz.id
cond = text(cond) if isinstance(cond, str) else cond
async with self._async_session() as session:
if cond is not None:
stmt = select(func.count(field)).where(cond)
else:
stmt = select(func.count(field))
return ((await session.execute(stmt)).scalar_one())
async def _insert_on_duplicate_key_update(self, clz: Type[DataObject], data: dict, on_duplicate_key_update: list) -> bool:
update_data = {}
insert_stmt = insert(clz).values(**data)
for col in on_duplicate_key_update:
if col not in data:
continue
update_data[col] = getattr(insert_stmt.inserted, col)
async with self._async_session() as session:
# run inside a transaction so the upsert is committed
async with session.begin():
return (await session.execute(insert_stmt.on_duplicate_key_update(**update_data))).rowcount > 0
async def get_now_time(self, cache: bool = True) -> datetime:
if not cache or not self.now_time:
async with self._async_session() as session:
async with session.begin():
self.now_time = (await session.execute(text("SELECT now() now"))).scalar_one()
return self.now_time
async def get_now_time_stamp(self, cache: bool = True) -> int:
if not cache or not self.now_time_stamp:
async with self._async_session() as session:
async with session.begin():
self.now_time_stamp = (await session.execute(text("SELECT unix_timestamp(now()) now"))).scalar_one()
return self.now_time_stamp
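A sketch of how a subclass might use the upsert helper above, assuming the pull_request table's unique key covers pull_request_id; the second argument lists the columns to refresh when the key already exists:

```python
from src.dao.pull_request import PullRequestDAO
from src.do.pull_request import PullRequestDO

async def upsert_title(dao: PullRequestDAO, pr_id: int, title: str) -> bool:
    now = await dao.get_now_time()
    data = {"pull_request_id": pr_id, "title": title,
            "create_time": now, "update_time": now}
    # INSERT ... ON DUPLICATE KEY UPDATE title, update_time
    return await dao._insert_on_duplicate_key_update(
        PullRequestDO, data, ["title", "update_time"])
```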

78
src/dao/pull_request.py Normal file

@ -0,0 +1,78 @@
from typing import Any, List, Optional
from sqlalchemy import select, update
from .mysql_ao import MysqlAO
from src.do.pull_request import PullRequestDO
class PullRequestDAO(MysqlAO):
_DO_class = PullRequestDO
async def insert_pull_request(self, id: int, title, project, repo_type, address, author, email,
target_branch, latest_commit: str) -> Optional[List[PullRequestDO]]:
data = {
'pull_request_id': id,
'project': project,
'title': title,
'type': repo_type,
'address': address,
'author': author,
'email': email,
'target_branch': target_branch,
'latest_commit': latest_commit,
'inline': False,
"create_time": await self.get_now_time(),
"update_time": await self.get_now_time()
}
return await self._insert_one(self._DO_class(**data))
async def fetch(self, cond: Any = None, limit: int = 0, start: int = 0) -> List[_DO_class]:
stmt = select(self._DO_class).order_by(
self._DO_class.update_time.desc())
if cond is not None:
stmt = stmt.where(cond)
if limit:
stmt = stmt.limit(limit)
if start:
stmt = stmt.offset(start)
async with self._async_session() as session:
answer = list((await session.execute(stmt)).all())
return answer
async def delete_pull_request(self, emp_id: str) -> Optional[bool]:
return await self._delete_one(self._DO_class, emp_id)
async def update_pull_request(self, id, title, project, repo_type, address, author, email, target_branch: str) -> bool:
update_time = await self.get_now_time()
# pass both conditions to where(): chaining them with Python's `and`
# would evaluate the truth value of a SQL expression instead of ANDing them
stmt = update(self._DO_class).where(
self._DO_class.pull_request_id == id,
self._DO_class.project == project).values(
title=title,
project=project,
type=repo_type,
address=address,
author=author,
email=email,
target_branch=target_branch,
update_time=update_time
)
async with self._async_session() as session:
async with session.begin():
return (await session.execute(stmt)).rowcount > 0
async def update_latest_commit(self, project, id, latest_commit) -> bool:
update_time = await self.get_now_time()
stmt = update(self._DO_class).where(
self._DO_class.pull_request_id == id,
self._DO_class.project == project).values(
update_time=update_time,
latest_commit=latest_commit
)
async with self._async_session() as session:
async with session.begin():
return (await session.execute(stmt)).rowcount > 0
async def update_inline_status(self, project, id, inline) -> bool:
update_time = await self.get_now_time()
stmt = update(self._DO_class).where(
self._DO_class.pull_request_id == id,
self._DO_class.project == project).values(
inline=inline,
update_time=update_time)
async with self._async_session() as session:
async with session.begin():
return (await session.execute(stmt)).rowcount > 0

108
src/dao/sync.py Normal file

@ -0,0 +1,108 @@
from typing import Any, List, Optional
from sqlalchemy import select, update
from .mysql_ao import MysqlAO
from src.do.sync import ProjectDO, JobDO
from src.dto.sync import CreateJobItem, CreateProjectItem
class ProjectDAO(MysqlAO):
_DO_class = ProjectDO
async def insert_project(self, item: CreateProjectItem) -> Optional[List[ProjectDO]]:
data = {
'name': item.name,
'github': item.github_address,
'gitee': item.gitee_address,
'gitlab': item.gitlab_address,
'code_china': item.code_china_address,
'gitlink': item.gitlink_address,
'github_token': item.github_token,
'gitee_token': item.gitee_token,
"create_time": await self.get_now_time(),
"update_time": await self.get_now_time()
}
return await self._insert_one(self._DO_class(**data))
async def fetch(self, cond: Any = None, limit: int = 0, start: int = 0) -> List[ProjectDO]:
stmt = select(self._DO_class).order_by(
self._DO_class.update_time.desc())
if cond is not None:
stmt = stmt.where(cond)
if limit:
stmt = stmt.limit(limit)
if start:
stmt = stmt.offset(start)
async with self._async_session() as session:
answer = list((await session.execute(stmt)).all())
return answer
async def delete_project(self, emp_id: str) -> Optional[bool]:
return await self._delete_one(self._DO_class, emp_id)
async def _get_count(self, cond) -> int:
return await self._count(self._DO_class, cond)
class JobDAO(MysqlAO):
_DO_class = JobDO
async def insert_job(self, project, item: CreateJobItem) -> Optional[List[JobDO]]:
data = {
'project': project,
'type': item.type,
'status': True,
'github_branch': item.github_branch,
'gitee_branch': item.gitee_branch,
'gitlab_branch': item.gitlab_branch,
'code_china_branch': item.code_china_branch,
'gitlink_branch': item.gitlink_branch,
"create_time": await self.get_now_time(),
"update_time": await self.get_now_time(),
"commit": 'no_commit',
"base": item.base
}
return await self._insert_one(self._DO_class(**data))
async def fetch(self, cond: Any = None, limit: int = 0, start: int = 0) -> List[_DO_class]:
stmt = select(self._DO_class).order_by(
self._DO_class.update_time.desc())
if cond is not None:
stmt = stmt.where(cond)
if limit:
stmt = stmt.limit(limit)
if start:
stmt = stmt.offset(start)
async with self._async_session() as session:
answer = list((await session.execute(stmt)).all())
return answer
async def list_all(self) -> List[_DO_class]:
stmt = select(self._DO_class).order_by(self._DO_class.id.desc())
async with self._async_session() as session:
answer = list((await session.execute(stmt)).all())
return answer
async def delete_job(self, emp_id: str) -> Optional[bool]:
return await self._delete_one(self._DO_class, emp_id)
async def update_status(self, _id: int, _status: bool) -> bool:
status = bool(_status)
stmt = update(self._DO_class).where(
self._DO_class.id == _id).values(status=status)
async with self._async_session() as session:
async with session.begin():
return (await session.execute(stmt)).rowcount > 0
async def update_commit(self, _id: int, _commit: str) -> bool:
if not _commit:
return False
stmt = update(self._DO_class).where(
self._DO_class.id == _id).values(commit=_commit)
async with self._async_session() as session:
async with session.begin():
return (await session.execute(stmt)).rowcount > 0

268
src/dao/sync_config.py Normal file

@ -0,0 +1,268 @@
from sqlalchemy import select, update, func
from sqlalchemy.exc import NoResultFound
from src.do.sync_config import SyncBranchMapping, SyncRepoMapping, LogDO
from .mysql_ao import MysqlAO
from src.utils.base import Singleton
from src.dto.sync_config import AllRepoDTO, GetBranchDTO, SyncRepoDTO, SyncBranchDTO, RepoDTO
from typing import List
from src.do.sync_config import SyncDirect, SyncType
class BaseDAO(MysqlAO):
def __init__(self, model_cls, *args, **kwargs):
self.model_cls = model_cls
super().__init__(*args, **kwargs)
async def get(self, **kwargs):
async with self._async_session() as session:
async with session.begin():
stmt = select(self.model_cls).filter_by(**kwargs)
try:
result = await session.execute(stmt)
instance = result.scalar_one()
return instance
except NoResultFound:
return None
async def create(self, **kwargs):
async with self._async_session() as session:
async with session.begin():
instance = self.model_cls(**kwargs)
session.add(instance)
await session.commit()
return instance
async def filter(self, **kwargs):
async with self._async_session() as session:
async with session.begin():
query = select(self.model_cls)
to_del_keys = []
for key, value in kwargs.items():
if isinstance(key, str) and "__contains" in key:
query = query.filter(self.model_cls.__dict__[key[:-10]].like(f"%{value}%"))
to_del_keys.append(key)
elif isinstance(key, str) and key.endswith("__in"):
query = query.filter(self.model_cls.__dict__[key[:-4]].in_(value))
to_del_keys.append(key)
for key in to_del_keys:
del kwargs[key]
stmt = query.filter_by(**kwargs).order_by(self.model_cls.id.desc())
result = await session.execute(stmt)
return result.scalars().all()
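# Usage sketch (hypothetical values): filter(repo_name__contains="ob") becomes a
# LIKE '%ob%' clause and filter(id__in=[1, 2]) an IN (1, 2) clause; any other
# kwargs are passed through to filter_by() unchanged.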
async def all(self):
async with self._async_session() as session:
async with session.begin():
stmt = select(self.model_cls).order_by(self.model_cls.id.desc())
result = await session.execute(stmt)
instances = result.scalars().all()
return instances
async def delete(self, instance):
async with self._async_session() as session:
async with session.begin():
await session.delete(instance)
await session.commit()
async def update(self, instance, **kwargs):
async with self._async_session() as session:
async with session.begin():
merged_instance = await session.merge(instance)
for attr, value in kwargs.items():
setattr(merged_instance, attr, value)
# flush and refresh inside the transaction; session.begin() commits on
# exit, and committing earlier would close the transaction under refresh()
await session.flush()
await session.refresh(merged_instance)
return merged_instance
async def values_list(self, *fields):
async with self._async_session() as session:
async with session.begin():
query = select(self.model_cls)
if fields:
query = query.with_entities(*fields)
result = await session.execute(query)
rows = result.fetchall()
if len(fields) == 1:
return [row[0] for row in rows]
else:
return [tuple(row) for row in rows]
class SyncRepoDAO(BaseDAO, metaclass=Singleton):
def __init__(self, *args, **kwargs):
super().__init__(SyncRepoMapping, *args, **kwargs)
async def create_repo(self, dto: SyncRepoDTO) -> RepoDTO:
async with self._async_session() as session:
async with session.begin():
dto.sync_granularity = SyncType(dto.sync_granularity)
dto.sync_direction = SyncDirect(dto.sync_direction)
do = SyncRepoMapping(**dto.dict())
session.add(do)
await session.flush()
data = RepoDTO(
enable=do.enable,
repo_name=do.repo_name,
internal_repo_address=do.internal_repo_address,
external_repo_address=do.external_repo_address,
sync_granularity=do.sync_granularity.name,
sync_direction=do.sync_direction.name
)
await session.commit()
return data
async def get_sync_repo(self, page_number: int, page_size: int, create_sort: bool) -> List[AllRepoDTO]:
async with self._async_session() as session:
async with session.begin():
stmt = select(SyncRepoMapping)
create_order = SyncRepoMapping.created_at if create_sort else SyncRepoMapping.created_at.desc()
stmt = stmt.order_by(create_order).offset((page_number - 1) * page_size).limit(page_size)
do_list: List[SyncRepoMapping] = (await session.execute(stmt)).scalars().all()
datas = []
for do in do_list:
data = AllRepoDTO(
id=do.id,
enable=do.enable,
repo_name=do.repo_name,
internal_repo_address=do.internal_repo_address,
external_repo_address=do.external_repo_address,
sync_granularity=do.sync_granularity.name,
sync_direction=do.sync_direction.name,
created_at=str(do.created_at)
)
datas.append(data)
return datas
class SyncBranchDAO(BaseDAO, metaclass=Singleton):
def __init__(self, *args, **kwargs):
super().__init__(SyncBranchMapping, *args, **kwargs)
async def create_branch(self, dto: SyncBranchDTO, repo_id: int) -> SyncBranchDTO:
async with self._async_session() as session:
async with session.begin():
do = SyncBranchMapping(**dto.dict(), repo_id=repo_id)
session.add(do)
data = SyncBranchDTO(
enable=do.enable,
internal_branch_name=do.internal_branch_name,
external_branch_name=do.external_branch_name
)
await session.commit()
return data
async def get_sync_branch(self, repo_id: int, page_number: int, page_size: int, create_sort: bool) -> List[GetBranchDTO]:
async with self._async_session() as session:
async with session.begin():
stmt = select(SyncBranchMapping).where(SyncBranchMapping.repo_id == repo_id)
create_order = SyncBranchMapping.created_at if create_sort else SyncBranchMapping.created_at.desc()
stmt = stmt.order_by(create_order).offset((page_number - 1) * page_size).limit(page_size)
do_list: List[SyncBranchMapping] = (await session.execute(stmt)).scalars().all()
datas = []
for do in do_list:
data = GetBranchDTO(
id=do.id,
enable=do.enable,
internal_branch_name=do.internal_branch_name,
external_branch_name=do.external_branch_name,
created_at=str(do.created_at)
)
datas.append(data)
return datas
async def sync_branch(self, repo_id: int) -> List[GetBranchDTO]:
async with self._async_session() as session:
async with session.begin():
stmt = select(SyncBranchMapping).where(SyncBranchMapping.repo_id == repo_id,
SyncBranchMapping.enable == 1)
do_list: List[SyncBranchMapping] = (await session.execute(stmt)).scalars().all()
datas = []
for do in do_list:
data = GetBranchDTO(
id=do.id,
enable=do.enable,
created_at=str(do.created_at),
internal_branch_name=do.internal_branch_name,
external_branch_name=do.external_branch_name
)
datas.append(data)
return datas
async def get_branch(self, repo_id: int, branch_name: str, dire: SyncDirect) -> List[GetBranchDTO]:
async with self._async_session() as session:
async with session.begin():
if dire == SyncDirect.to_outer:
stmt = select(SyncBranchMapping).where(SyncBranchMapping.repo_id == repo_id,
SyncBranchMapping.enable.is_(True),
SyncBranchMapping.internal_branch_name == branch_name)
else:
stmt = select(SyncBranchMapping).where(SyncBranchMapping.repo_id == repo_id,
SyncBranchMapping.enable.is_(True),
SyncBranchMapping.external_branch_name == branch_name)
do_list: List[SyncBranchMapping] = (await session.execute(stmt)).scalars().all()
datas = []
for do in do_list:
data = GetBranchDTO(
id=do.id,
enable=do.enable,
created_at=str(do.created_at),
internal_branch_name=do.internal_branch_name,
external_branch_name=do.external_branch_name
)
datas.append(data)
return datas
class LogDAO(BaseDAO, metaclass=Singleton):
def __init__(self, *args, **kwargs):
super().__init__(LogDO, *args, **kwargs)
async def init_sync_repo_log(self, repo_name, direct, log_content):
async with self._async_session() as session:
async with session.begin():
do = LogDO(repo_name=repo_name, sync_direct=direct, log=log_content)
session.add(do)
await session.commit()
async def update_sync_repo_log(self, repo_name, direct, log_content):
async with self._async_session() as session:
async with session.begin():
stmt = update(LogDO).where(LogDO.repo_name == repo_name,
LogDO.branch_id.is_(None), LogDO.commit_id.is_(None)).\
values(
sync_direct=direct,
log=log_content,
# log_history=func.CONCAT(LogDO.log_history, log_content),
update_at=func.now()
)
await session.execute(stmt)
await session.commit()
async def init_branch_log(self, repo_name, direct, branch_id, commit_id, log_content):
async with self._async_session() as session:
async with session.begin():
do = LogDO(repo_name=repo_name, sync_direct=direct, branch_id=branch_id,
commit_id=commit_id, log=log_content)
session.add(do)
await session.commit()
async def update_branch_log(self, repo_name, direct, branch_id, commit_id, log_content):
async with self._async_session() as session:
async with session.begin():
stmt = update(LogDO).where(LogDO.repo_name == repo_name, LogDO.branch_id == branch_id). \
values(
sync_direct=direct,
commit_id=commit_id,
log=log_content,
# log_history=func.CONCAT(LogDO.log_history, log_content),
update_at=func.now()
)
await session.execute(stmt)
await session.commit()

24
src/do/account.py Normal file

@ -0,0 +1,24 @@
from sqlalchemy import Column, String, Integer, DateTime
from .data_object import DataObject
class GithubAccountDO(DataObject):
__tablename__ = 'github_account'
id = Column(Integer, primary_key=True, autoincrement=True)
domain = Column(String(20))
nickname = Column(String(20))
account = Column(String(20))
email = Column(String(20))
create_time = Column(DateTime)
update_time = Column(DateTime)
class GiteeAccountDO(DataObject):
__tablename__ = 'gitee_account'
id = Column(Integer, primary_key=True, autoincrement=True)
domain = Column(String(20))
nickname = Column(String(20))
account = Column(String(20))
email = Column(String(20))
create_time = Column(DateTime)
update_time = Column(DateTime)

8
src/do/data_object.py Normal file

@ -0,0 +1,8 @@
from sqlalchemy.ext.declarative import declarative_base
__all__ = ("DataObject")
DataObject = declarative_base()

12
src/do/log.py Normal file

@ -0,0 +1,12 @@
from sqlalchemy import Column, String, Integer, DateTime
from .data_object import DataObject
class LogDO(DataObject):
__tablename__ = 'sync_log'
id = Column(Integer, primary_key=True, autoincrement=True)
sync_job_id = Column(Integer)
log_type = Column(String(50))
log = Column(String(500))
create_time = Column(DateTime)

20
src/do/pull_request.py Normal file

@ -0,0 +1,20 @@
from sqlalchemy import Column, String, Integer, Text, DateTime, Boolean
from .data_object import DataObject
class PullRequestDO(DataObject):
__tablename__ = 'pull_request'
id = Column(Integer, primary_key=True, autoincrement=True)
pull_request_id = Column(Integer)
title = Column(Text)
project = Column(String(20))
type = Column(String(20))
address = Column(String(50))
author = Column(String(20))
email = Column(String(50))
target_branch = Column(String(50))
inline = Column(Boolean)
latest_commit = Column(String(50))
create_time = Column(DateTime)
update_time = Column(DateTime)

36
src/do/sync.py Normal file

@ -0,0 +1,36 @@
from sqlalchemy import Column, String, Integer, Boolean, DateTime
from .data_object import DataObject
class ProjectDO(DataObject):
__tablename__ = 'sync_project'
id = Column(Integer, primary_key=True, autoincrement=True)
name = Column(String(20))
github = Column(String(50))
gitee = Column(String(50))
gitlab = Column(String(50))
gitlink = Column(String(50))
code_china = Column(String(50))
github_token = Column(String(50))
gitee_token = Column(String(50))
code_china_token = Column(String(50))
gitlink_token = Column(String(50))
create_time = Column(DateTime)
update_time = Column(DateTime)
class JobDO(DataObject):
__tablename__ = 'sync_job'
id = Column(Integer, primary_key=True, autoincrement=True)
project = Column(String(20))
status = Column(Boolean)
type = Column(String(20))
github_branch = Column(String(50))
gitee_branch = Column(String(50))
gitlab_branch = Column(String(50))
code_china_branch = Column(String(50))
gitlink_branch = Column(String(50))
create_time = Column(DateTime)
update_time = Column(DateTime)
commit = Column(String(50))
base = Column(String(20))

60
src/do/sync_config.py Normal file

@ -0,0 +1,60 @@
from sqlalchemy import Column, String, Integer, Boolean, TIMESTAMP, text, Enum, Text
from .data_object import DataObject
import enum
class SyncType(enum.Enum):
"""
Repository-level sync -> all -> 1
Branch-level sync -> one -> 2
"""
all = 1
one = 2
class SyncDirect(enum.Enum):
"""
Sync a repo/branch from internal to external -> to_outer -> 1
Sync a repo/branch from external to internal -> to_inter -> 2
"""
to_outer = 1
to_inter = 2
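# Round-trip example: SyncType(1) is SyncType.all and SyncType.all.name == 'all';
# SyncRepoDAO.create_repo relies on this when converting the integer fields of
# SyncRepoDTO into these enum columns.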
class SyncRepoMapping(DataObject):
__tablename__ = 'sync_repo_mapping'
id = Column(Integer, primary_key=True)
repo_name = Column(String(128), unique=True, nullable=False, comment="仓库名称")
enable = Column(Boolean, default=True, comment="是否启用同步")
internal_repo_address = Column(String, nullable=False, comment="内部仓库地址")
external_repo_address = Column(String, nullable=False, comment="外部仓库地址")
sync_granularity = Column(Enum(SyncType), comment="同步类型")
sync_direction = Column(Enum(SyncDirect), comment="首次同步方向")
created_at = Column(TIMESTAMP, server_default=text('CURRENT_TIMESTAMP'), comment="创建时间")
class SyncBranchMapping(DataObject):
__tablename__ = 'sync_branch_mapping'
id = Column(Integer, primary_key=True)
enable = Column(Boolean, default=True, comment="是否启用同步")
repo_id = Column(Integer, nullable=False, comment="关联的仓库id")
internal_branch_name = Column(String, nullable=False, comment="内部仓库分支名")
external_branch_name = Column(String, nullable=False, comment="外部仓库分支名")
created_at = Column(TIMESTAMP, server_default=text('CURRENT_TIMESTAMP'), comment="创建时间")
class LogDO(DataObject):
__tablename__ = 'repo_sync_log'
id = Column(Integer, primary_key=True, autoincrement=True)
branch_id = Column(Integer, nullable=True, comment="分支id")
repo_name = Column(String(128), unique=True, nullable=False, comment="仓库名称")
commit_id = Column(String(128), nullable=True, comment="commit id")
sync_direct = Column(Enum(SyncDirect), comment="同步方向")
log = Column(Text, comment="同步日志")
# log_history = Column(Text, comment="历史日志")
created_at = Column(TIMESTAMP, server_default=text('CURRENT_TIMESTAMP'), comment="创建时间")
update_at = Column(TIMESTAMP, server_default=text('CURRENT_TIMESTAMP'), comment="更新时间")

30
src/dto/account.py Normal file

@ -0,0 +1,30 @@
from pydantic import BaseModel
class Account(BaseModel):
id: int
domain: str
nickname: str
class GithubAccount(Account):
account: str
email: str
class GiteeAccount(Account):
account: str
email: str
class CreateAccountItem(BaseModel):
domain: str
nickname: str
account: str
email: str
class UpdateAccountItem(CreateAccountItem):
id: int

7
src/dto/auth.py Normal file

@ -0,0 +1,7 @@
from fastapi import Body
from pydantic import BaseModel
class AuthItem(BaseModel):
type: str = Body(..., description="仓库类型")
token: str = Body(..., description="账号token")

14
src/dto/log.py Normal file

@ -0,0 +1,14 @@
from typing import Optional
from pydantic import BaseModel
from datetime import datetime
class Log(BaseModel):
id: int
sync_job_id: int
commit_id: Optional[int]
pull_request_id: Optional[int]
log_type: str
log: str
create_time: datetime

31
src/dto/pull_request.py Normal file

@ -0,0 +1,31 @@
from pydantic import BaseModel
from src.common.repo import RepoType
class PullRequest(BaseModel):
id: int
title: str
project: str
type: str
address: str
author: str
email: str
target_branch: str
latest_commit: str
class GithubPullRequest(PullRequest):
type = RepoType.Github
class GiteePullRequest(PullRequest):
type = RepoType.Gitee
class GitlabPullRequest(PullRequest):
type = RepoType.Gitlab
class GitcodePullRequest(PullRequest):
type = RepoType.Gitcode

68
src/dto/sync.py Normal file

@ -0,0 +1,68 @@
from fastapi import Body
from pydantic import BaseModel
from typing import Optional
from enum import Enum
from datetime import datetime
class Color(Enum):
red = 0
green = 1
class SyncType:
OneWay = "OneWay"
TwoWay = "TwoWay"
class Project(BaseModel):
id: int
name: str
github_address: Optional[str]
gitee_address: Optional[str]
gitlab_address: Optional[str]
code_china_address: Optional[str]
gitlink_address: Optional[str]
github_token: Optional[str]
gitee_token: Optional[str]
code_china_token: Optional[str]
gitlink_token: Optional[str]
class CreateProjectItem(BaseModel):
name: str = Body(..., description="合并工程名字")
github_address: str = Body(None, description="GitHub地址")
gitlab_address: str = Body(None, description="Gitlab地址")
gitee_address: str = Body(None, description="Gitee地址")
code_china_address: str = Body(None, description="CodeChina地址")
gitlink_address: str = Body(None, description="Gitlink地址")
github_token: str = Body(None, description="GitHub账户token")
gitee_token: str = Body(None, description="Gitee账户token")
code_china_token: str = Body(None, description="CodeChina账户token")
gitlink_token: str = Body(None, description="Gitlink账户token")
class Job(BaseModel):
id: int
project: str
status: Color
type: str
github_branch: Optional[str]
gitee_branch: Optional[str]
gitlab_branch: Optional[str]
code_china_branch: Optional[str]
gitlink_branch: Optional[str]
commit: str
base: Optional[str]
create_time: datetime
update_time: datetime
class CreateJobItem(BaseModel):
github_branch: str = Body(None, description="GitHub分支名")
gitlab_branch: str = Body(None, description="Gitlab分支名")
gitee_branch: str = Body(None, description="Gitee分支名")
code_china_branch: str = Body(None, description="CodeChina分支名")
gitlink_branch: str = Body(None, description="Gitlink分支名")
type: str = Body(..., description="分支同步类型")
base: str = Body(None, description="基础仓库")

67
src/dto/sync_config.py Normal file

@ -0,0 +1,67 @@
from pydantic import BaseModel, Field
class SyncRepoDTO(BaseModel):
repo_name: str = Field(..., description="仓库名称")
enable: bool = Field(..., description="同步状态")
internal_repo_address: str = Field(..., description="内部仓库地址")
external_repo_address: str = Field(..., description="外部仓库地址")
sync_granularity: int = Field(..., description="1 为仓库粒度的同步, 2 为分支粒度的同步")
sync_direction: int = Field(..., description="1 表示内部仓库同步到外部, 2 表示外部仓库同步到内部")
class SyncBranchDTO(BaseModel):
enable: bool = Field(..., description="是否启用分支同步")
internal_branch_name: str = Field(..., description="内部仓库分支名")
external_branch_name: str = Field(..., description="外部仓库分支名")
class RepoDTO(BaseModel):
enable: bool = Field(..., description="是否启用同步")
repo_name: str = Field(..., description="仓库名称")
internal_repo_address: str = Field(..., description="内部仓库地址")
external_repo_address: str = Field(..., description="外部仓库地址")
sync_granularity: str = Field(..., description="1 为仓库粒度的同步, 2 为分支粒度的同步")
sync_direction: str = Field(..., description="1 表示内部仓库同步到外部, 2 表示外部仓库同步到内部")
class AllRepoDTO(BaseModel):
id: int = Field(..., description="分支id")
created_at: str = Field('', description="创建时间")
enable: bool = Field(..., description="是否启用同步")
repo_name: str = Field(..., description="仓库名称")
internal_repo_address: str = Field(..., description="内部仓库地址")
external_repo_address: str = Field(..., description="外部仓库地址")
sync_granularity: str = Field(..., description="1 为仓库粒度的同步, 2 为分支粒度的同步")
sync_direction: str = Field(..., description="1 表示内部仓库同步到外部, 2 表示外部仓库同步到内部")
class GetBranchDTO(BaseModel):
id: int = Field(..., description="分支id")
created_at: str = Field('', description="创建时间")
enable: bool = Field(..., description="是否启用分支同步")
internal_branch_name: str = Field(..., description="内部仓库分支名")
external_branch_name: str = Field(..., description="外部仓库分支名")
class LogDTO(BaseModel):
id: int = Field(..., description="日志id")
branch_id: int = Field(None, description="分支id")
repo_name: str = Field(..., description="仓库名称")
commit_id: str = Field(None, description="commit id")
sync_direct: str = Field(..., description="同步方向")
log: str
# log_history: str
created_at: str = Field('', description="创建时间")
update_at: str = Field('', description="更新时间")
class Config:
arbitrary_types_allowed = True
# class SyncDTO(BaseModel):
# repo_name: str = Field(..., description="仓库名称")
# branch_name: str = Field(..., description="分支名称")
# sync_direct: str = Field(..., description="1 表示内部仓库同步到外部, 2 表示外部仓库同步到内部")

11
src/dto/user.py Normal file

@ -0,0 +1,11 @@
from typing import Union
from pydantic import BaseModel
from pydantic.fields import Field
class UserInfoDto(BaseModel):
name: str = Field(..., description="用户登录名")
nick: str = Field(..., description="花名")
emp_id: Union[int, str] = Field(..., description="工号")
email: str = Field("", description="用户邮箱")
dept: str = Field("", description="用户部门")

14
src/router/__init__.py Normal file

@ -0,0 +1,14 @@
from extras.obfastapi.frame import OBAPIRouter
__all__ = ("CE_ROBOT", "PROJECT", "JOB",
"PULL_REQUEST", "ACCOUNT", "USER", "LOG", "AUTH", "SYNC_CONFIG")
CE_ROBOT = OBAPIRouter(prefix='/cerobot/hirobot', tags=['Robot'])
PROJECT = OBAPIRouter(prefix='/cerobot/projects', tags=['Projects'])
JOB = OBAPIRouter(prefix='/cerobot', tags=['Jobs'])
PULL_REQUEST = OBAPIRouter(prefix='/cerobot', tags=['Pullrequests'])
ACCOUNT = OBAPIRouter(prefix='/cerobot/account', tags=['Account'])
USER = OBAPIRouter(prefix="/cerobot/users", tags=["Users"])
LOG = OBAPIRouter(prefix="/cerobot/log", tags=["Log"])
AUTH = OBAPIRouter(prefix="/cerobot/auth", tags=["Auth"])
SYNC_CONFIG = OBAPIRouter(prefix='/cerobot/sync', tags=['Sync'])

0
src/service/__init__.py Normal file

109
src/service/account.py Normal file

@ -0,0 +1,109 @@
from typing import Any, List, Optional
from src.utils.logger import logger
from .service import Service
from sqlalchemy import text
from src.dao.account import GithubAccountDAO
from src.dao.account import GiteeAccountDAO
from src.do.account import GithubAccountDO, GiteeAccountDO
from src.dto.account import GithubAccount as GithubAccountDTO
from src.dto.account import GiteeAccount as GiteeAccountDTO
from src.dto.account import CreateAccountItem, UpdateAccountItem
from src.base.error_code import Errors
class GithubAccountService(Service):
def __init__(self) -> None:
self.github_account_dao = GithubAccountDAO()
async def list_github_account(self, search: Optional[str] = None) -> Optional[List[GithubAccountDTO]]:
if search is not None:
cond = text(
f"domain like '%{search}%' or nickname like '%{search}%' or account like '%{search}%' or email like '%{search}%'")
all = await self.github_account_dao.fetch(cond=cond)
else:
all = await self.github_account_dao.fetch()
data = []
for account in all:
data.append(self._do_to_dto(account))
return data
async def get_github_account_by_domain(self, domain: str) -> Optional[GithubAccountDTO]:
if domain == "":
return None
cond = text(f"domain = '{domain}'")
all = await self.github_account_dao.fetch(cond=cond)
if len(all) == 0:
return None
else:
return self._do_to_dto(all[0])
async def insert_github_account(self, item: CreateAccountItem) -> Optional[List[GithubAccountDO]]:
cond = text(f"domain like '%{item.domain}%'")
all = await self.github_account_dao.fetch(cond=cond)
if len(all) > 0:
logger.error(
f"Can not save the account because there are one Github account about {item.domain}")
raise Errors.INSERT_FAILD
return await self.github_account_dao.insert_githubAccount(item.domain, item.nickname, item.account, item.email)
async def delete_github_account(self, id: int) -> Optional[bool]:
return await self.github_account_dao.delete_account(id)
async def update_github_account(self, item: UpdateAccountItem) -> Optional[bool]:
return await self.github_account_dao.update_account(item)
async def get_count(self, search: Optional[str] = None) -> int:
if search is not None:
cond = text(
f"domain like '%{search}%' or nickname like '%{search}%' or account like '%{search}%' or email like '%{search}%'")
return await self.github_account_dao._count(cond)
else:
return await self.github_account_dao._count()
def _do_to_dto(self, account: GithubAccountDO) -> GithubAccountDTO:
return GithubAccountDTO(
**{
'id': account["GithubAccountDO"].id,
'domain': account["GithubAccountDO"].domain,
'nickname': account["GithubAccountDO"].nickname,
'account': account["GithubAccountDO"].account,
'email': account["GithubAccountDO"].email
}
)
class GiteeAccountService(Service):
def __init__(self) -> None:
self.gitee_account_dao = GiteeAccountDAO()
async def list_gitee_account(self) -> Optional[List[GiteeAccountDTO]]:
# AccountDAO exposes fetch(), not list_all()
all = await self.gitee_account_dao.fetch()
data = []
for account in all:
data.append(self._do_to_dto(account))
return data
async def insert_gitee_account(self, item: CreateAccountItem) -> Optional[List[GiteeAccountDO]]:
return await self.gitee_account_dao.insert_giteeAccount(item.domain, item.nickname, item.account, item.email)
async def delete_gitee_account(self, domain: str) -> Optional[bool]:
return await self.gitee_account_dao.delete_account(domain)
async def update_gitee_account(self, item: UpdateAccountItem) -> Optional[bool]:
return await self.gitee_account_dao.update_account(item)
async def get_count(self, cond: Any = None) -> int:
return await self.gitee_account_dao._count(cond)
def _do_to_dto(self, account: GiteeAccountDO) -> GiteeAccountDTO:
return GiteeAccountDTO(
**{
'id': account["GithubAccountDO"].id,
'domain': account["GiteeAccountDO"].domain,
'nickname': account["GiteeAccountDO"].nickname,
'account': account["GiteeAccountDO"].account,
'email': account["GiteeAccountDO"].email
}
)

26
src/service/auth.py Normal file

@ -0,0 +1,26 @@
from .service import Service
from src.dto.auth import AuthItem
from src.common.repo import RepoType
from src.utils import github, gitee
class AuthService(Service):
def __init__(self) -> None:
pass
def auth(self, item: AuthItem) -> bool:
if item.type == RepoType.Github:
return github.github_auth(item.token)
elif item.type == RepoType.Gitee:
return gitee.gitee_auth(item.token)
else:
return False

148
src/service/cronjob.py Normal file

@ -0,0 +1,148 @@
import os
import re
import shlex
import shutil
import subprocess
from typing import List
from src.base import config
from src.base.status_code import GITMSGException, Status, git_error_mapping
from src.base.config import SYNC_DIR
from src.dao.sync_config import SyncRepoDAO, SyncBranchDAO
from src.do.sync_config import SyncDirect, SyncType
from src.dto.sync_config import SyncBranchDTO
from src.utils.sync_log import sync_log, LogType, log_path
from src.service.sync_config import LogService
sync_repo_dao = SyncRepoDAO()
sync_branch_dao = SyncBranchDAO()
log_service = LogService()
# map a git error message in stderr to the corresponding status enum instance
def get_git_error(stderr: str):
for error_str, git_error in git_error_mapping.items():
if re.search(error_str, stderr):
return GITMSGException(git_error)
return GITMSGException(Status.UNKNOWN_ERROR)
def shell(cmd, dire: str, log_name: str, user: str):
log = f'Execute cmd: {cmd}'
sync_log(LogType.INFO, log, log_name, user)
output = subprocess.run(shlex.split(cmd), cwd=dire, capture_output=True, text=True)
if output.returncode != 0:
git_error = get_git_error(output.stderr)
if config.LOG_DETAIL:
sync_log(LogType.ERROR, output.stderr, log_name, user)
raise git_error
return output
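# Usage sketch (hypothetical arguments): shell('git fetch internal master',
# repo_dir, log_name, user) logs the command, runs it in repo_dir, and raises
# the GITMSGException mapped from stderr when git exits non-zero.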
def init_repos(repo, log_name: str, user: str):
os.makedirs(SYNC_DIR, exist_ok=True)
repo_dir = os.path.join(SYNC_DIR, repo.repo_name)
if not os.path.exists(repo_dir):
sync_log(LogType.INFO, "初始化仓库 *********", log_name, user)
if repo.sync_direction == SyncDirect.to_outer:
# clone the internal repository into the sync directory
shell(f'git clone -b master {repo.internal_repo_address} {repo_dir}', SYNC_DIR, log_name, user)
else:
# clone the external repository into the sync directory
shell(f'git clone -b master {repo.external_repo_address} {repo_dir}', SYNC_DIR, log_name, user)
# register the internal remote and fetch it immediately (-f)
shell(f'git remote add -f internal {repo.internal_repo_address}', repo_dir, log_name, user)
# register the external remote and fetch it immediately (-f)
shell(f'git remote add -f external {repo.external_repo_address}', repo_dir, log_name, user)
def inter_to_outer(repo, branch, log_name: str, user: str):
repo_dir = os.path.join(SYNC_DIR, repo.repo_name)
inter_name = branch.internal_branch_name
outer_name = branch.external_branch_name
    # Fetch the inter_name branch from the internal remote to refresh its tracking info locally
    shell(f"git fetch internal {inter_name}", repo_dir, log_name, user)
    # Switch to inter_name, force-resetting it to internal/inter_name
    shell(f"git checkout -B {inter_name} internal/{inter_name}", repo_dir, log_name, user)
    # Push the local inter_name branch to the outer_name branch of the external remote
    shell(f"git push external {inter_name}:{outer_name}", repo_dir, log_name, user)
# commit id
result = shell(f"git log HEAD~1..HEAD --oneline", repo_dir, log_name, user)
commit_id = result.stdout.split(" ")[0]
sync_log(LogType.INFO, f'[COMMIT ID: {commit_id}]', log_name, user)
return commit_id
def outer_to_inter(repo, branch, log_name: str, user: str):
repo_dir = os.path.join(SYNC_DIR, repo.repo_name)
inter_name = branch.internal_branch_name
outer_name = branch.external_branch_name
    # Fetch the outer_name branch from the external remote to refresh its tracking info locally
    shell(f"git fetch external {outer_name}", repo_dir, log_name, user)
    # Switch to outer_name, force-resetting it to external/outer_name
    shell(f"git checkout -B {outer_name} external/{outer_name}", repo_dir, log_name, user)
    # Push the local outer_name branch to the inter_name branch of the internal remote
    shell(f"git push internal {outer_name}:{inter_name}", repo_dir, log_name, user)
# commit id
result = shell(f"git log HEAD~1..HEAD --oneline", repo_dir, log_name, user)
commit_id = result.stdout.split(" ")[0]
sync_log(LogType.INFO, f'[COMMIT ID: {commit_id}]', log_name, user)
return commit_id
async def sync_repo_task(repo, user):
if repo.sync_granularity == SyncType.one:
branches = await sync_branch_dao.sync_branch(repo_id=repo.id)
await sync_branch_task(repo, branches, repo.sync_direction, user)
else:
log_name = f'sync_{repo.repo_name}.log'
init_repos(repo, log_name, user)
        sync_log(LogType.INFO, f'************ Syncing repository {repo.repo_name} ************', log_name, user)
if repo.sync_direction == SyncDirect.to_outer:
stm = shell(f"git ls-remote --heads {repo.internal_repo_address}", SYNC_DIR, log_name, user)
branch_list_output = stm.stdout.split('\n')
for branch in branch_list_output:
if branch:
branch_name = branch.split('/')[-1].strip()
branch = SyncBranchDTO(enable=1, internal_branch_name=branch_name, external_branch_name=branch_name)
sync_log(LogType.INFO, f'Execute inter to outer {branch_name} branch Sync', log_name, user)
inter_to_outer(repo, branch, log_name, user)
else:
stm = shell(f"git ls-remote --heads {repo.external_repo_address}", SYNC_DIR, log_name, user)
branch_list_output = stm.stdout.split('\n')
for branch in branch_list_output:
if branch:
branch_name = branch.split('/')[-1].strip()
branch = SyncBranchDTO(enable=1, internal_branch_name=branch_name, external_branch_name=branch_name)
sync_log(LogType.INFO, f'Execute outer to inter {branch_name} branch Sync', log_name, user)
outer_to_inter(repo, branch, log_name, user)
        if config.DELETE_SYNC_DIR:
            if os.path.exists(SYNC_DIR):
                # os.removedirs only removes empty directories; rmtree clears the populated work tree
                shutil.rmtree(SYNC_DIR)
            sync_log(LogType.INFO, f'Deleted sync working directory: {SYNC_DIR}', log_name, user)
        sync_log(LogType.INFO, f'************ Repository {repo.repo_name} sync finished ************', log_name, user)
await log_service.insert_repo_log(repo_name=repo.repo_name, direct=repo.sync_direction)
os.remove(os.path.join(log_path, log_name))
async def sync_branch_task(repo, branches, direct, user):
for branch in branches:
log_name = f'sync_{repo.repo_name}_{branch.id}.log'
init_repos(repo, log_name, user)
        sync_log(LogType.INFO, '************ Executing branch sync ************', log_name, user)
if direct == SyncDirect.to_inter:
sync_log(LogType.INFO, f'Execute outer to inter {branch.external_branch_name} branch Sync', log_name, user)
commit_id = outer_to_inter(repo, branch, log_name, user)
else:
sync_log(LogType.INFO, f'Execute inter to outer {branch.internal_branch_name} branch Sync', log_name, user)
commit_id = inter_to_outer(repo, branch, log_name, user)
        if config.DELETE_SYNC_DIR:
            if os.path.exists(SYNC_DIR):
                shutil.rmtree(SYNC_DIR)
            sync_log(LogType.INFO, f'Deleted sync working directory: {SYNC_DIR}', log_name, user)
        sync_log(LogType.INFO, '************ Branch sync finished ************', log_name, user)
await log_service.insert_branch_log(repo.repo_name, direct, branch.id, commit_id)
os.remove(os.path.join(log_path, log_name))
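
The error translation above scans `git_error_mapping` (a regex-to-status table defined in `src.base.status_code`) against git's stderr. A self-contained sketch of the same pattern, with made-up mapping entries since the real table isn't shown here:

```python
import re

# hypothetical entries; the real mapping lives in src.base.status_code
demo_mapping = {
    r"couldn't find remote ref": 'BRANCH_NOT_FOUND',
    r'Permission denied': 'AUTH_FAILED',
}

def classify(stderr: str) -> str:
    for error_re, status in demo_mapping.items():
        if re.search(error_re, stderr):
            return status
    return 'UNKNOWN_ERROR'

print(classify("fatal: couldn't find remote ref release-4.x"))  # BRANCH_NOT_FOUND
```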

48
src/service/log.py Normal file
View File

@ -0,0 +1,48 @@
from typing import List, Union, Optional, Dict
from typing import Any
from fastapi import (
Body
)
from src.utils.logger import logger
from pydantic.main import BaseModel
from sqlalchemy import text
from .service import Service
from src.dao.log import LogDAO
from src.do.log import LogDO
from src.dto.log import Log as LogDTO
class LogService(Service):
def __init__(self) -> None:
self._log_dao = LogDAO()
async def get_logs_by_job(self, id, page: int = 1, size: int = 10) -> Optional[List[LogDTO]]:
cond = text(f"sync_job_id = {id}")
start = (page - 1) * size
all = await self._log_dao.fetch(cond=cond, start=start, limit=size)
data = []
for log in all:
data.append(self._do_to_dto(log))
return data
async def save_logs(self, id, type, msg) -> Optional[LogDTO]:
return await self._log_dao.insert_log(id, type, msg)
async def delete_logs(self) -> Optional[bool]:
return await self._log_dao.delete_log()
async def count_logs(self, id) -> int:
cond = text(f"sync_job_id = {id}")
return await self._log_dao._get_count(cond=cond)
def _do_to_dto(self, log: LogDO) -> LogDTO:
return LogDTO(
**{
'id': log[LogDO].id,
'sync_job_id': log[LogDO].sync_job_id,
'log_type': log[LogDO].log_type,
'log': log[LogDO].log,
'create_time': log[LogDO].create_time
}
)
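
`get_logs_by_job` pages with a 1-based page number translated into a row offset; the arithmetic in isolation:

```python
def page_window(page: int = 1, size: int = 10):
    start = (page - 1) * size  # rows skipped before the window
    return start, size         # maps to OFFSET/LIMIT in the DAO fetch

assert page_window() == (0, 10)        # first page starts at row 0
assert page_window(3, 10) == (20, 10)  # third page skips 20 rows
```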

152
src/service/pull_request.py Normal file
View File

@ -0,0 +1,152 @@
import time
from typing import List, Union, Optional, Dict
from typing import Any
from .service import Service
from sqlalchemy import text
from src.utils.logger import logger
from src.dto.pull_request import PullRequest as PullRequestDTO
from src.dao.pull_request import PullRequestDAO
from src.do.pull_request import PullRequestDO
from src.common.repo_factory import RepoFactory, PullRequestFactory
from src.common.github import PullRequest
class PullRequestService(Service):
def __init__(self) -> None:
self.pull_request_dao = PullRequestDAO()
async def fetch_pull_request(self, project: str, search: Optional[str] = None) -> Optional[List[PullRequestDTO]]:
if search:
cond = f"project = '{project}' and (title like '%{search}%' or pull_request_id = {search})"
else:
cond = f"project = '{project}'"
all = await self.pull_request_dao.fetch(text(cond))
if not all:
return None
data = []
for pr in all:
data.append(self._do_to_dto(pr))
return data
async def delete_pull_request(self, id: str) -> Optional[bool]:
return await self.pull_request_dao.delete_pull_request(id)
async def count_pull_request(self, project: str, search: Optional[str] = None) -> int:
if search is not None:
cond = f"project = '{project}' and (title like '%{search}%' or pull_request_id = {search})"
else:
cond = f"project = '{project}'"
return await self.pull_request_dao._count(PullRequestDO, text(cond))
async def sync_pull_request(self, project, organization, repo: str) -> int:
logger.info(f"sync the repo {repo} of {organization}")
github = RepoFactory.create('Github', project, organization, repo)
if github is None:
logger.error(f"create repo object failed")
return None
github.fetch_pull_request()
await github.save_pull_request()
async def merge_pull_request(self, organization, repo: str, id: int) -> Optional[bool]:
pull_request = PullRequestFactory.create(
'Github', organization, repo, id)
if pull_request is None:
logger.error(f"create pull request object failed")
return None
return pull_request.comment("/merge")
async def merge_pull_request_code(self, organization, repo: str, id: int) -> Optional[bool]:
pull_request = PullRequestFactory.create(
'Github', organization, repo, id)
if pull_request is None:
logger.error(f"create pull request object failed")
return None
return pull_request._send_merge_request()
async def approve_pull_request(self, organization, repo: str, id: int) -> Optional[bool]:
pull_request = PullRequestFactory.create(
'Github', organization, repo, id)
if pull_request is None:
logger.error(f"create pull request object failed")
return None
return pull_request.approve()
async def press_pull_request(self, organization, repo: str, id: int) -> Optional[bool]:
# TODO
pull_request = PullRequestFactory.create(
'Github', organization, repo, id)
if pull_request is None:
logger.error(f"create pull request object failed")
return None
pass
async def close_pull_request(self, organization, repo: str, id: int) -> Optional[bool]:
pull_request = PullRequestFactory.create(
'Github', organization, repo, id)
if pull_request is None:
logger.error(f"create pull request object failed")
return None
return pull_request.close()
async def get_count(self, cond: Any = None) -> int:
return await self.pull_request_dao._count(PullRequestDO, cond)
async def judge_pull_request_need_merge(self, project, organization, repo: str, id: int) -> Optional[bool]:
cond = f"project = '{project}' and pull_request_id = {id}"
all = await self.pull_request_dao.fetch(text(cond))
if all is None or len(all) == 0:
            return False
pull_request = PullRequestFactory.create(
'Github', organization, repo, id)
if pull_request is None:
logger.error(f"create pull request object failed")
return None
comment_merge = await pull_request.check_if_merge()
if not comment_merge:
return False
elif not all[0]["PullRequestDO"].inline and comment_merge:
return True
else:
# check if the pull request has new commit
ans = await self.judge_pull_request_has_newer_commit(project, id, pull_request)
return ans
async def judge_pull_request_has_newer_commit(self, project, id, pull_request: PullRequest) -> Optional[bool]:
cond = f"project = '{project}' and pull_request_id = {id}"
ans = await self.pull_request_dao.fetch(text(cond))
        if ans is None or len(ans) == 0:
            return False
        origin_latest_commit = ans[0]["PullRequestDO"].latest_commit
        latest_commit = pull_request.get_latest_commit()
        # a differing hash means the pull request picked up new commits since the last sync
        return origin_latest_commit != latest_commit
async def update_latest_commit(self, pull: PullRequestDTO):
return await self.pull_request_dao.update_latest_commit(pull.project, pull.id, pull.latest_commit)
async def update_inline_status(self, pull: PullRequestDTO, inline: bool) -> Optional[bool]:
return await self.pull_request_dao.update_inline_status(pull.project, pull.id, inline)
def _do_to_dto(self, pr: PullRequestDO) -> PullRequestDTO:
return PullRequestDTO(
**{
'id': pr["PullRequestDO"].pull_request_id,
'title': pr["PullRequestDO"].title,
'project': pr["PullRequestDO"].project,
'type': pr["PullRequestDO"].type,
'address': pr["PullRequestDO"].address,
'author': pr["PullRequestDO"].author,
'email': pr["PullRequestDO"].email,
'target_branch': pr["PullRequestDO"].target_branch,
'latest_commit': pr["PullRequestDO"].latest_commit
}
)
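
For clarity, the merge decision implemented by `judge_pull_request_need_merge` and `judge_pull_request_has_newer_commit`, condensed into one pure function (argument names are mine, not the service's):

```python
def need_merge(comment_merge: bool, inline: bool,
               stored_commit: str, remote_commit: str) -> bool:
    if not comment_merge:                    # no /merge comment yet
        return False
    if not inline:                           # never synced inward: merge now
        return True
    return stored_commit != remote_commit    # re-merge only when new commits landed

assert need_merge(True, False, "", "") is True
assert need_merge(True, True, "abc123", "abc123") is False
```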

9
src/service/service.py Normal file
View File

@ -0,0 +1,9 @@
# coding: utf-8
from typing import Union
class Service(object):
@staticmethod
def formate_string(s: Union[bytes, str]) -> str:
return s.decode(errors='replace') if isinstance(s, bytes) else s

173
src/service/sync.py Normal file
View File

@ -0,0 +1,173 @@
from typing import List, Union, Optional, Dict
from typing import Any
from fastapi import (
Body
)
from src.utils.logger import logger
from pydantic.main import BaseModel
from sqlalchemy import text
from .service import Service
from src.dao.sync import ProjectDAO, JobDAO
from src.do.sync import ProjectDO, JobDO
from src.dto.sync import Project as ProjectDTO
from src.dto.sync import Job as JobDTO
from src.dto.sync import CreateProjectItem, CreateJobItem
class ProjectService(Service):
def __init__(self) -> None:
self._project_dao = ProjectDAO()
async def list_projects(self, page: int = 1, size: int = 10) -> Optional[List[ProjectDTO]]:
start = (page - 1) * size
all = await self._project_dao.fetch(start=start, limit=size)
data = []
for project in all:
data.append(self._do_to_dto(project))
return data
async def insert_project(self, item: CreateProjectItem) -> Optional[ProjectDTO]:
# we need to send request to check the input if illegal
count = await self.get_count(text(f"name='{item.name}'"))
if count > 0:
logger.info(f"the project {item.name} is exist in the database")
return None
return await self._project_dao.insert_project(item)
async def delete_project(self, id: str) -> Optional[bool]:
return await self._project_dao.delete_project(id)
async def search_project(self, id: Optional[int] = None, name: Optional[str] = None) -> Optional[List[ProjectDTO]]:
limit = None
start = 0
if name:
all = await self._project_dao.fetch(text(f"name like '%{name}%'"), limit, start)
elif id:
all = await self._project_dao.fetch(text(f"id='{id}'"), limit, start)
else:
all = []
data = []
if len(all) > 0:
for project in all:
data.append(self._do_to_dto(project))
return data
async def get_count(self, cond: Any = None) -> int:
return await self._project_dao._get_count(cond)
async def get_count_by_search(self, name: str) -> int:
return await self.get_count(text(f"name like '%{name}%'"))
def _do_to_dto(self, project) -> ProjectDTO:
return ProjectDTO(
**{
'id': project["ProjectDO"].id,
'name': project["ProjectDO"].name,
'github_address': project["ProjectDO"].github,
'gitee_address': project["ProjectDO"].gitee,
'gitlab_address': project["ProjectDO"].gitlab,
'code_china_address': project["ProjectDO"].code_china,
'gitlink_address': project["ProjectDO"].gitlink,
'github_token': project["ProjectDO"].github_token,
'gitee_token': project["ProjectDO"].gitee_token
}
)
class JobService(Service):
def __init__(self) -> None:
self._job_dao = JobDAO()
async def list_jobs(self, project: Optional[str] = None, search: Optional[str] = None, source: Optional[str] = None, page: int = 1, size: int = 20) -> Optional[List[JobDTO]]:
cond = None
start = (page - 1) * size
if project is None:
all = await self._job_dao.fetch()
else:
cond = text(f"project = '{project}'")
if search is not None:
cond = text(
f"project = '{project}' and github_branch like '%{search}%'")
if source is not None:
cond = text(
f"project = '{project}' and LENGTH({source}) !=0")
all = await self._job_dao.fetch(cond, start=start, limit=size)
if not all:
return None
data = []
for job in all:
data.append(self._do_to_dto(job))
return data
async def source_list_jobs(self, source: Optional[str] = None) -> Optional[List[JobDTO]]:
cond = None
if source is not None:
cond = text(f"status = 1 and LENGTH({source}) !=0")
all = await self._job_dao.fetch(cond)
data = []
for job in all:
data.append(self._do_to_dto(job))
return data
async def get_job(self, id: int) -> Optional[JobDTO]:
job = await self._job_dao.fetch(text(f"id = {id}"))
if not job:
return None
return self._do_to_dto(job[0])
async def create_job(self, project, item: CreateJobItem) -> Optional[List[JobDO]]:
# we do not need to check if exist the same github branch
# we can sync a branch to different branches
return await self._job_dao.insert_job(project, item)
async def delete_job(self, id: int) -> Optional[bool]:
return await self._job_dao.delete_job(id)
async def get_count(self, cond: Any = None) -> int:
return await self._job_dao._count(JobDO, cond)
async def count_job(self, project: Optional[str] = None, search: Optional[str] = None, source: Optional[str] = None) -> int:
if not project:
return await self.get_count()
else:
cond = text(f"project = '{project}'")
if search is not None:
cond = text(
f"project = '{project}' and github_branch like '%{search}%'")
if source is not None:
cond = text(
f"project = '{project}' and LENGTH({source}) !=0")
            return await self.get_count(cond)
async def update_status(self, _id: int, _status: bool) -> bool:
return await self._job_dao.update_status(_id, _status)
async def get_job_lateset_commit(self, id: int) -> Optional[str]:
job = await self.get_job(id)
if job:
return job.commit
else:
return None
async def update_job_lateset_commit(self, _id: int, _commit: str) -> bool:
return await self._job_dao.update_commit(_id, _commit)
def _do_to_dto(self, job) -> JobDTO:
return JobDTO(
**{
'id': job["JobDO"].id,
'project': job["JobDO"].project,
'status': job["JobDO"].status,
'type': job["JobDO"].type,
'github_branch': job["JobDO"].github_branch,
'gitee_branch': job["JobDO"].gitee_branch,
'gitlab_branch': job["JobDO"].gitlab_branch,
'code_china_branch': job["JobDO"].code_china_branch,
'gitlink_branch': job["JobDO"].gitlink_branch,
'commit': job["JobDO"].commit,
'base': job["JobDO"].base,
'create_time': job["JobDO"].create_time,
'update_time': job["JobDO"].update_time
}
)

186
src/service/sync_config.py Normal file
View File

@ -0,0 +1,186 @@
import re
from typing import List, Union, Optional, Dict
from .service import Service
from src.dao.sync_config import SyncBranchDAO, SyncRepoDAO, LogDAO
from src.dto.sync_config import SyncBranchDTO, SyncRepoDTO, RepoDTO, AllRepoDTO, GetBranchDTO, LogDTO
from src.do.sync_config import SyncDirect, SyncType
from src.base.status_code import Status, SYNCException
from src.utils.sync_log import log_path
class SyncService(Service):
def __init__(self) -> None:
self.sync_repo_dao = SyncRepoDAO()
self.sync_branch_dao = SyncBranchDAO()
async def same_name_repo(self, repo_name: str) -> bool:
instances = await self.sync_repo_dao.get(repo_name=repo_name)
if instances is None:
return False
return True
async def create_repo(self, dto: SyncRepoDTO) -> Optional[RepoDTO]:
repo = await self.sync_repo_dao.create_repo(dto)
return repo
async def check_status(self, repo_name: str, dto: SyncBranchDTO) -> int:
repo = await self.sync_repo_dao.get(repo_name=repo_name)
if repo is None:
raise SYNCException(Status.REPO_NOTFOUND)
stm = {"repo_id": repo.id, "internal_branch_name": dto.internal_branch_name,
"external_branch_name": dto.external_branch_name}
branches = await self.sync_branch_dao.get(**stm)
if repo.sync_granularity == SyncType.all:
raise SYNCException(Status.GRANULARITY_ERROR)
if branches is not None:
raise SYNCException(Status.BRANCH_EXISTS)
return repo.id
async def create_branch(self, dto: SyncBranchDTO, repo_id: int) -> Optional[SyncBranchDTO]:
branch = await self.sync_branch_dao.create_branch(dto, repo_id=repo_id)
return branch
async def get_sync_repo(self, page_num: int, page_size: int, create_sort: bool) -> Optional[List[AllRepoDTO]]:
repos = await self.sync_repo_dao.get_sync_repo(page_number=page_num,
page_size=page_size, create_sort=create_sort)
return repos
async def get_sync_branches(self, repo_id: int, page_num: int,
page_size: int, create_sort: bool) -> Optional[List[GetBranchDTO]]:
branches = await self.sync_branch_dao.get_sync_branch(repo_id=repo_id, page_number=page_num,
page_size=page_size, create_sort=create_sort)
return branches
async def sync_branch(self, repo_id: int, branch_name: str, dire: SyncDirect) -> Optional[List[GetBranchDTO]]:
branches = await self.sync_branch_dao.get_branch(repo_id=repo_id, branch_name=branch_name, dire=dire)
return branches
async def get_repo_id(self, repo_name: str) -> int:
repo = await self.sync_repo_dao.get(repo_name=repo_name)
if repo is None:
raise SYNCException(Status.REPO_NOTFOUND)
if repo.sync_granularity == SyncType.all:
raise SYNCException(Status.GRANULARITY)
return repo.id
async def get_repo(self, repo_name: str):
instances = await self.sync_repo_dao.get(repo_name=repo_name)
return instances
async def get_all_repo(self):
repos = await self.sync_repo_dao.filter(enable=1)
return repos
async def delete_repo(self, repo_name: str) -> SYNCException:
repo = await self.sync_repo_dao.get(repo_name=repo_name)
if repo is None:
return SYNCException(Status.REPO_NOTFOUND)
branches = await self.sync_branch_dao.filter(repo_id=repo.id)
if len(branches) > 0:
for branch in branches:
await self.sync_branch_dao.delete(branch)
await self.sync_repo_dao.delete(repo)
return SYNCException(Status.SUCCESS)
async def delete_branch(self, repo_name: str, branch_name: str) -> SYNCException:
repo = await self.sync_repo_dao.get(repo_name=repo_name)
if repo is None:
return SYNCException(Status.REPO_NOTFOUND)
if repo.sync_granularity == SyncType.all:
return SYNCException(Status.GRANULARITY_DELETE)
if repo.sync_direction == SyncDirect.to_outer:
stm = {"repo_id": repo.id, "internal_branch_name": branch_name}
else:
stm = {"repo_id": repo.id, "external_branch_name": branch_name}
branches = await self.sync_branch_dao.filter(**stm)
if branches is None:
return SYNCException(Status.BRANCH_DELETE)
for branch in branches:
await self.sync_branch_dao.delete(branch)
return SYNCException(Status.SUCCESS)
async def update_repo(self, repo_name: str, enable: bool) -> SYNCException:
repo = await self.sync_repo_dao.get(repo_name=repo_name)
if repo is None:
return SYNCException(Status.REPO_NOTFOUND)
await self.sync_repo_dao.update(repo, enable=enable)
return SYNCException(Status.SUCCESS)
async def update_branch(self, repo_name: str, branch_name: str, enable: bool) -> SYNCException:
repo = await self.sync_repo_dao.get(repo_name=repo_name)
if repo is None:
return SYNCException(Status.REPO_NOTFOUND)
if repo.sync_direction == SyncDirect.to_outer:
stm = {"repo_id": repo.id, "internal_branch_name": branch_name}
else:
stm = {"repo_id": repo.id, "external_branch_name": branch_name}
branches = await self.sync_branch_dao.filter(**stm)
if branches is None:
return SYNCException(Status.BRANCH_DELETE)
if repo.enable == 0 and enable:
await self.sync_repo_dao.update(repo, enable=enable)
for branch in branches:
await self.sync_branch_dao.update(branch, enable=enable)
return SYNCException(Status.SUCCESS)
class LogService(Service):
def __init__(self) -> None:
self.sync_log_dao = LogDAO()
async def insert_repo_log(self, repo_name: str, direct: str):
addr = f"{log_path}/sync_{repo_name}.log"
with open(addr, 'r') as fd:
log_content = fd.read()
log_history = f"{log_path}/sync_{repo_name}_history.log"
with open(log_history, 'a') as log_:
log_.write(log_content)
stm = {"repo_name": repo_name, "branch_id": None, "commit_id": None}
log = await self.sync_log_dao.filter(**stm)
if len(log) < 1:
await self.sync_log_dao.init_sync_repo_log(repo_name=repo_name, direct=direct, log_content=log_content)
else:
await self.sync_log_dao.update_sync_repo_log(repo_name=repo_name, direct=direct, log_content=log_content)
async def insert_branch_log(self, repo_name: str, direct: str, branch_id: int, commit_id: str):
addr = f"{log_path}/sync_{repo_name}_{branch_id}.log"
with open(addr, 'r') as fd:
log_content = fd.read()
log_history = f"{log_path}/sync_{repo_name}_{branch_id}_history.log"
with open(log_history, 'a') as log_:
log_.write(log_content)
stm = {"repo_name": repo_name, "branch_id": branch_id}
log = await self.sync_log_dao.filter(**stm)
if len(log) < 1:
await self.sync_log_dao.init_branch_log(repo_name, direct, branch_id, commit_id, log_content)
else:
await self.sync_log_dao.update_branch_log(repo_name, direct, branch_id, commit_id, log_content)
async def get_logs(self, repo_name: str, branch_id: int) -> Optional[List[LogDTO]]:
stm = {"repo_name": repo_name, "branch_id": branch_id}
logs_repo = await self.sync_log_dao.filter(**stm)
datas = []
for do in logs_repo:
data = LogDTO(
id=do.id,
branch_id=do.branch_id,
repo_name=do.repo_name,
commit_id=do.commit_id,
sync_direct=do.sync_direct.name,
log=str(do.log),
# log_history=str(do.log_history),
created_at=str(do.created_at),
update_at=str(do.update_at)
)
datas.append(data)
return datas
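
Both `insert_repo_log` and `insert_branch_log` follow the same file pattern: read the per-run log, append it to a rolling `_history` file, then upsert the content into the log table. The file half, isolated (paths illustrative):

```python
import os

def archive_run_log(log_path: str, run_log_name: str) -> str:
    run_log = os.path.join(log_path, run_log_name)
    history = run_log.replace('.log', '_history.log')
    with open(run_log, 'r') as fd:
        content = fd.read()
    with open(history, 'a') as out:  # append keeps earlier runs intact
        out.write(content)
    return content  # the caller persists this into the DB row
```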

27
src/utils/author.py Normal file
View File

@ -0,0 +1,27 @@
from typing import Optional
from src.dto.account import GithubAccount
from src.service.account import GithubAccountService
def get_author_domain(aliemail: str):
# for example ali email
if aliemail == "":
return None
domain = aliemail.split("@", 1)[0]
return domain
async def get_github_author_and_email(domain: str) -> Optional[GithubAccount]:
service = GithubAccountService()
ans = await service.get_github_account_by_domain(domain)
if ans is None:
        # return the author short name and oceanbase.com email
return {
'author': 'obdev',
'email': 'obdev'
}
else:
return {
'author': ans.account,
'email': ans.email
}
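
The domain is simply the local part of the corporate address; for instance (address made up):

```python
assert get_author_domain("huaixin@example.com") == "huaixin"
assert get_author_domain("") is None  # empty address yields no domain
```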

28
src/utils/base.py Normal file
View File

@ -0,0 +1,28 @@
import re
GIT_HTTP_PATTERN = r'https://.*\.com/(.*)/(.*)\.git'
GIT_SSH_PATTERN = r'git@.*\.com:(.*)/(.*)\.git'
def check_addr(repo_address: str) -> bool:
try:
if repo_address.startswith('https'):
pattern = GIT_HTTP_PATTERN
else:
pattern = GIT_SSH_PATTERN
match_obj = re.match(pattern, repo_address, re.M | re.I)
if match_obj:
return True
else:
return False
except:
return False
class Singleton(type):
_instances = {}
def __call__(cls, *args, **kwargs):
if cls not in cls._instances:
cls._instances[cls] = super().__call__(*args, **kwargs)
return cls._instances[cls]
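
A quick illustration of the metaclass: any class declaring `metaclass=Singleton` hands back one cached instance (the `Config` class here is made up):

```python
class Config(metaclass=Singleton):
    def __init__(self):
        self.values = {}

a, b = Config(), Config()
assert a is b  # the second call returns the cached instance
```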

19
src/utils/cmd.py Normal file
View File

@ -0,0 +1,19 @@
import shlex
import json
import subprocess
from src.utils.logger import logger, Log
from src.base.code import LogType
from typing import Any
async def shell(cmd, dir: str, job=None, env: Any = None):
    log = 'Run cmd ' + cmd
    # guard: job defaults to None, so only dereference job.id when a job is passed
    await Log(LogType.INFO, log, job.id if job else None)
    try:
        output = subprocess.run(shlex.split(cmd), cwd=dir,
                                capture_output=True, text=True, env=env)
        return output.stdout, output.stderr
    except subprocess.CalledProcessError as e:
        # only reachable if a caller re-runs this with check=True; kept as a safety net
        if e.output.startswith('error:'):
            error = json.loads(e.output[7:])
            logger.error(f"{error['code']}:{error['message']}")

27
src/utils/gitcode.py Normal file
View File

@ -0,0 +1,27 @@
import re
gitcode_http_partten = r'https://gitcode.net/(.*)/(.*)'
gitcode_ssh_partten = r'git@gitcode.net:(.*)/(.*).git'
def check_gitcode_address(address: str) -> bool:
try:
if address.startswith('https'):
partten = gitcode_http_partten
else:
partten = gitcode_ssh_partten
matchObj = re.match(partten, address, re.M | re.I)
if matchObj:
return True
else:
return False
except:
return False
def gitcode_auth(token: str) -> bool:
pass
def get_gitcode_address_with_token(http_address: str, token: str) -> str:
pass

39
src/utils/gitee.py Normal file
View File

@ -0,0 +1,39 @@
import re
import requests
from typing import Optional
from src.base import config
from src.utils.logger import logger
gitee_http_partten = r'https://gitee.com/(.*)/(.*)'
gitee_ssh_partten = r'git@gitee.com:(.*)/(.*).git'
def check_gitee_address(address: str) -> bool:
try:
if address.startswith('https'):
partten = gitee_http_partten
else:
partten = gitee_ssh_partten
matchObj = re.match(partten, address, re.M | re.I)
if matchObj:
return True
else:
return False
except:
return False
def gitee_auth(token: str) -> bool:
pass
def get_gitee_address_with_token(http_address: str, token: str) -> str:
try:
if not http_address.startswith('http'):
raise Exception('http address is error')
if token == "":
raise Exception('token is empty')
owner_name = http_address[8:].split("/")[1]
return http_address[:8] + owner_name + ":" + token + '@' + http_address[8:]
except Exception as e:
print(e)
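
The rewrite splices `owner:token@` into the host position; for a repository URL shaped like the ones in this project (URL and token are placeholders):

```python
url = get_gitee_address_with_token("https://gitee.com/oceanbase/ob-robot.git", "TOKEN")
assert url == "https://oceanbase:TOKEN@gitee.com/oceanbase/ob-robot.git"
```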

65
src/utils/github.py Normal file
View File

@ -0,0 +1,65 @@
import re
from src.common import crawler
github_http_partten = r'https://github.com/(.*)/(.*)'
github_ssh_partten = r'git@github.com:(.*)/(.*).git'
def transfer_github_to_name(address: str):
try:
if address.startswith('https'):
partten = github_http_partten
else:
partten = github_ssh_partten
matchObj = re.match(partten, address, re.M | re.I)
if matchObj:
organization = matchObj.group(1)
repo = matchObj.group(2)
return organization, repo
else:
return None, None
except:
return None, None
def check_github_address(address: str) -> bool:
try:
if address.startswith('https'):
partten = github_http_partten
else:
partten = github_ssh_partten
matchObj = re.match(partten, address, re.M | re.I)
if matchObj:
return True
else:
return False
except:
return False
def github_auth(token: str) -> bool:
if token is None:
return False
url = "https://api.github.com/user"
header = {
'Authorization': 'token ' + token}
resp = crawler.Fetch(url, header=header, way='Get')
if "message" in resp.keys():
return False
else:
return True
def get_github_address_with_token(http_address: str, token: str) -> str:
try:
if not http_address.startswith('http'):
raise Exception('http address is error')
if token == "":
raise Exception('token is empty')
owner_name = http_address[8:].split("/")[1]
return http_address[:8] + owner_name + ":" + token + '@' + http_address[8:]
except Exception as e:
print(e)
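
Both address forms resolve to the same `(organization, repo)` pair; for example:

```python
assert transfer_github_to_name("git@github.com:oceanbase/oceanbase.git") == ("oceanbase", "oceanbase")
assert transfer_github_to_name("https://github.com/oceanbase/oceanbase") == ("oceanbase", "oceanbase")
```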

100
src/utils/gitlab.py Normal file
View File

@ -0,0 +1,100 @@
import re
import requests
from typing import Optional
from src.base import config
from src.utils.logger import logger
# TODO
gitlab_http_partten = r''
gitlab_ssh_partten = r''
antcode_http_partten = r''
antcode_ssh_partten = r''
gitlab_api_address = config.GITLAB_ENV['gitlab_api_address']
token = config.ACCOUNT['gitlab_token']
def get_inter_repo_type(address: str) -> Optional[str]:
if address.startswith('http://gitlab') or address.startswith('git@gitlab'):
return 'gitlab'
elif address.startswith('https://code') or address.startswith('git@code'):
return 'antcode'
else:
return None
def get_organization_and_name_from_url(url: str):
try:
if url.startswith('http://gitlab'):
partten = gitlab_http_partten
elif url.startswith('git@gitlab'):
partten = gitlab_ssh_partten
elif url.startswith('https://code'):
partten = antcode_http_partten
elif url.startswith('git@code'):
partten = antcode_ssh_partten
else:
partten = None
if partten is not None:
matchObj = re.match(partten, url, re.M | re.I)
if matchObj:
organization = matchObj.group(1)
repo = matchObj.group(2)
return organization, repo
else:
return None, None
except:
return None, None
def check_gitlab_address(url: str) -> bool:
try:
if url.startswith('http://gitlab'):
partten = gitlab_http_partten
elif url.startswith('git@gitlab'):
partten = gitlab_ssh_partten
elif url.startswith('https://code'):
partten = antcode_http_partten
elif url.startswith('git@code'):
partten = antcode_ssh_partten
else:
partten = None
if partten is not None:
matchObj = re.match(partten, url, re.M | re.I)
if matchObj:
return True
else:
return False
except:
return False
def get_repo_id_from_url(url: str) -> Optional[int]:
if url == "":
return None
organization, repo = get_organization_and_name_from_url(url)
if organization is None or repo is None:
logger.info("The url has no organization or repo")
return None
logger.info(f"The url's organization is {organization}")
logger.info(f"The url's repo is {repo}")
expect_namespace = organization + " / " + repo
param = {
'private_token': token,
'url': url
}
response = requests.get(url=gitlab_api_address, params=param).json()
if response is None or len(response) == 0:
logger.info("There is no data from the gitlab request")
return None
for repo_info in response:
if repo_info['name_with_namespace'] == expect_namespace:
return repo_info['id']
return None
def get_gitlab_address_with_token(http_address: str, token: str) -> str:
pass

31
src/utils/gitlink.py Normal file
View File

@ -0,0 +1,31 @@
import re
import requests
from typing import Optional
from src.base import config
from src.utils.logger import logger
gitlink_http_partten = r'https://gitlink.org.cn/(.*)/(.*)'
gitlink_ssh_partten = r'git@code.gitlink.org.cn:(.*)/(.*).git'
def check_gitlink_address(address: str) -> bool:
try:
if address.startswith('https'):
partten = gitlink_http_partten
else:
partten = gitlink_ssh_partten
matchObj = re.match(partten, address, re.M | re.I)
if matchObj:
return True
else:
return False
except:
return False
def gitlink_auth(token: str) -> bool:
pass
def get_gitlink_address_with_token(http_address: str, token: str) -> str:
pass

57
src/utils/logger.py Normal file
View File

@ -0,0 +1,57 @@
import os
import time
from loguru import logger
from typing import Optional
from src.base import config
from src.service.log import LogService
from src.base.code import LogType
basedir = os.path.dirname(os.path.dirname(
os.path.dirname(os.path.abspath(__file__))))
# print(f"log basedir{basedir}") # /xxx/python_code/FastAdmin/backend/app
# locate log file
log_path = os.path.join(basedir, 'logs')
if not os.path.exists(log_path):
os.mkdir(log_path)
log_path_error = os.path.join(
log_path, f'{time.strftime("%Y-%m-%d")}_error.log')
# log config
logger.add(log_path_error, rotation="12:00", retention="5 days", enqueue=True)
def JOB_LOG(sync_job: str, log_type: str, msg: str, commit: str = None):
trace = logger.add(f"{log_path}/sync_job_{sync_job}.log")
if log_type == LogType.INFO:
logger.info(msg)
elif log_type == LogType.ERROR:
logger.error(msg)
elif log_type == LogType.WARNING:
logger.warning(msg)
elif log_type == LogType.DEBUG:
logger.debug(msg)
else:
pass
logger.remove(trace)
return
async def Log(type: str, msg: str, sync_job_id: Optional[int] = None):
# use the function if you want to save git log to database
if type == LogType.INFO:
logger.info(msg)
elif type == LogType.ERROR:
logger.error(msg)
elif type == LogType.WARNING:
logger.warning(msg)
else:
return
if sync_job_id is None:
return
if config.LOG_SAVE:
service = LogService()
await service.save_logs(sync_job_id, type, msg)
return

83
src/utils/sync_log.py Normal file
View File

@ -0,0 +1,83 @@
import os
import time
import logging
basedir = os.path.dirname(os.path.dirname(
os.path.dirname(os.path.abspath(__file__))))
log_path = os.path.join(basedir, 'logs')
if not os.path.exists(log_path):
os.mkdir(log_path)
api_log_name = os.path.join(
log_path, f'{time.strftime("%Y-%m-%d")}_api.log')
sync_log_name = os.path.join(
log_path, f'{time.strftime("%Y-%m-%d")}_sync.log')
class LogType:
INFO = 'info'
ERROR = 'ERROR'
WARNING = 'warning'
DEBUG = "debug"
def sync_log(log_type: str, msg: str, log_name: str, user="robot"):
name = os.path.join(log_path, log_name)
    # Create a file handler for this log and set its level to INFO
    file_handler = logging.FileHandler(name)
    # console_handler = logging.StreamHandler()
    file_handler.setLevel(logging.INFO)
    # Create a formatter that fixes the log line layout
    formatter = logging.Formatter('%(asctime)s | %(levelname)s | %(op_name)s - %(message)s',
                                  datefmt='%Y-%m-%d %H:%M:%S')
    file_handler.setFormatter(formatter)
    # Create a logger
    logger = logging.getLogger('logger')
    logger.setLevel(logging.INFO)
    # Attach the handler to the logger
    logger.addHandler(file_handler)
user = {'op_name': user}
if log_type == LogType.INFO:
logger.info(msg, extra=user)
elif log_type == LogType.ERROR:
logger.error(msg, extra=user)
elif log_type == LogType.WARNING:
logger.warning(msg, extra=user)
elif log_type == LogType.DEBUG:
logger.debug(msg, extra=user)
else:
pass
logger.removeHandler(file_handler)
return
def api_log(log_type: str, msg: str, user="robot"):
    # Create a file handler for the API log and set its level to INFO
    file_handler = logging.FileHandler(api_log_name)
    # console_handler = logging.StreamHandler()
    file_handler.setLevel(logging.INFO)
    # Create a formatter that fixes the log line layout
    formatter = logging.Formatter('%(asctime)s | %(levelname)s | %(op_name)s - %(message)s',
                                  datefmt='%Y-%m-%d %H:%M:%S')
    file_handler.setFormatter(formatter)
    # Create a logger
    logger = logging.getLogger('logger')
    logger.setLevel(logging.INFO)
    # Attach the handler to the logger
    logger.addHandler(file_handler)
user = {'op_name': user}
if log_type == LogType.INFO:
logger.info(msg, extra=user)
elif log_type == LogType.ERROR:
logger.error(msg, extra=user)
elif log_type == LogType.WARNING:
logger.warning(msg, extra=user)
else:
pass
logger.removeHandler(file_handler)
return
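
Each call attaches a `FileHandler`, writes one record with the operator name injected via `extra`, and detaches the handler, so concurrent sync tasks can target distinct files. A sample call and the line it produces (timestamp illustrative):

```python
sync_log(LogType.INFO, 'Execute cmd: git fetch internal master', 'sync_demo.log')
# logs/sync_demo.log now contains a line like:
# 2024-04-10 14:32:49 | INFO | robot - Execute cmd: git fetch internal master
```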

535
sync.py Normal file
View File

@ -0,0 +1,535 @@
import asyncio
import os
import json
from typing import Optional
import requests
import re
from pathlib import Path
from src.service.pull_request import PullRequestService
from src.service.sync import JobService, ProjectService
from src.service.log import LogService
from src.dto.sync import Color, SyncType
from src.dto.sync import Job as JobDTO
from src.dto.pull_request import PullRequest as PullRequestDTO
from src.dto.sync import Project as ProjectDTO
from src.utils import cmd, author, gitlab, github
from src.utils.logger import logger
from src.base import config
from src.utils.logger import Log
from src.base.code import LogType
from src.common.repo import Repo, RepoType
async def apply_diff(project, job, pull: PullRequestDTO, dir):
organization, repo = github.transfer_github_to_name(project.github_address)
baseUrl = ""
if pull.type == RepoType.Github:
baseUrl = f"{config.GITHUB_ENV['github_api_diff_address']}/{organization}/{repo}/pull/"
elif pull.type == RepoType.Gitee:
baseUrl = f"{config.GITEE_ENV['gitee_api_diff_address']}/{organization}/{repo}/pull/"
elif pull.type == RepoType.Gitcode:
pass
diffUrl = baseUrl + str(pull.id) + ".diff"
tmpfile = dir + "/" + str(pull.id) + "_diff"
download_diff_cmd = ""
if pull.type == RepoType.Github:
download_diff_cmd = f"curl -X GET {diffUrl} -H 'Accept: application/vnd.github.v3.diff'"
elif pull.type == RepoType.Gitee:
download_diff_cmd = f"curl -X GET {diffUrl}"
elif pull.type == RepoType.Gitcode:
pass
with open(tmpfile, "w") as diff_file:
diff, err = await cmd.shell(download_diff_cmd, dir, job)
diff_file.write(diff)
# git apply --check first
out, err = await cmd.shell('git apply --check ' + tmpfile, dir, job)
if out != "":
raise ValueError(f"git apply --check failed")
out, err = await cmd.shell('git apply ' + tmpfile, dir, job)
if err.startswith("error"):
await Log(LogType.ERROR, "The git apply operation has some conflict", job.id)
await cmd.shell('rm -rf ' + dir, '.', job)
return
await cmd.shell(f"rm -rf {tmpfile}", dir, job)
await cmd.shell('git add .', dir, job)
async def sync_common(project, job, pull: PullRequestDTO):
try:
await Log(LogType.INFO, f"The project base repo is {project.base}", job.id)
await Log(LogType.INFO, f"Sync the job code from other repo to base {project.base} repo", job.id)
dir = f"/tmp/{job.project}_job_inter_{job.id}_pull_{pull.id}"
await Log(LogType.INFO, f"The pull request dir is {dir}", job.id)
await cmd.shell('mkdir ' + dir, '.', job)
await cmd.shell(
f"git clone -b {job.gitlab_branch} {project.gitlab_address}", dir, job)
repo_dir = dir + "/" + project.name
# GitHub pull request
if project.base != RepoType.Github and project.github_address is not None:
await cmd.shell('git status', repo_dir, job)
new_branch = 'github_pr' + str(pull.id)
await Log(LogType.INFO, f"The new branch is {new_branch}", job.id)
await cmd.shell('git checkout -b ' + new_branch, repo_dir, job)
            await apply_diff(project, job, pull, repo_dir)
commit_msg = "Github pull request #" + str(pull.id)
await cmd.shell(f"git commit -m \"{commit_msg}\"", repo_dir, job)
await cmd.shell(f"git push -uv origin {new_branch}", repo_dir, job)
inter_type = gitlab.get_inter_repo_type(project.gitlab_address)
if inter_type is None:
await Log(LogType.ERROR,
f"The {project.gitlab_address} is not belong to gitlab or antcode", job.id)
else:
# send a merge request
if inter_type == 'gitlab':
# Alibaba Gitlab repoCode Aone
await Log(LogType.INFO,
f"Merge the pull request to internal Gitlab {project.name}", job.id)
repo_id = gitlab.get_repo_id_from_url(
project.gitlab_address)
if repo_id is None:
await Log(LogType.ERROR,
f"We can not get the repo id {repo_id}", job.id)
await Log(LogType.INFO,
f"The project's gitlab repo id is {repo_id}", job.id)
await Log(LogType.INFO,
f"send the merge request about the pull request #{pull.id}", job.id)
merge_to_code_aone_inter(
repo_id, pull.id, new_branch, job.gitlab_branch)
else:
# Alipay Antcode repo
await Log(LogType.INFO,
f"Merge the pull request to internal Antcode {project.name}", job.id)
organization, name = gitlab.get_organization_and_name_from_url(
project.gitlab_address)
                    await merge_to_antcode_inter(
                        job.id, pull.id, organization, name, new_branch, job.gitlab_branch)
# update the pull request inline status
service = PullRequestService()
await service.update_inline_status(pull, True)
await service.update_latest_commit(pull)
# Gitee pull request
# TODO: Gitcode pull request
except:
msg = f"The pull request #{pull.id} sync to the internal failed"
await Log(LogType.ERROR, msg, job.id)
finally:
await cmd.shell('rm -rf ' + dir, '.', job)
def merge_to_code_aone_inter(repo_id: int, pull_id: int, source_branch: str, target_branch: str):
url = f""
param = {
'private_token': config.ACCOUNT['gitlab_token'],
}
data = {
"description": "Merge the pull request #" + str(pull_id),
"source_branch": source_branch,
"target_branch": target_branch,
"title": "Merge the pull request #" + str(pull_id)
}
response = requests.post(url=url, params=param,
data=json.dumps(data)).json()
return response
async def merge_to_antcode_inter(job_id: int, pull_id: int, organization, name, source_branch, target_branch: str):
await Log(LogType.INFO,
"send the merge request about the pull request #{pull_id}", job_id)
headers = {
"PRIVATE-TOKEN": config.ACCOUNT['antcode_token']
}
mainSiteUrl = ""
organization = "oceanbase-docs"
name = "oceanbase-doc"
path = f"{organization}/{name}"
req_str = f"{mainSiteUrl}/api/v3/projects/find_with_namespace?path={path}"
response = requests.get(url=req_str, headers=headers).json()
projectId = response['id']
await Log(LogType.INFO, f"The Antcode project ID is {projectId}", job_id)
# merge request
merge_req_str = f"{mainSiteUrl}/api/v3/projects/{projectId}/pull_requests"
param = {
'source_branch': source_branch,
'target_branch': target_branch,
'squash_merge': True
}
    response = requests.post(
        url=merge_req_str, params=param, headers=headers).json()
return
async def sync_oceanbase(project, job, pull):
title = f"Github merge request #{pull.id} {pull.title}"
# Sync OceanBase code need ob flow
await Log(LogType.INFO, "Sync the oceanbase code from github to inter", job.id)
# Create ob task
create_task_cmd = f"/usr/local/obdev/libexec/ob-task create --subject=\"{title}\" -T bug --branch={job.gitlab_branch} --description=\"{title}\""
out, err = await cmd.shell(create_task_cmd, '.', job, env=dict(os.environ, AONE_ISSUE_NICK='官明'))
issue_num = str.splitlines(out)[1].replace("[issue-id]", "").strip()
await Log(LogType.INFO,
f"The issue number is {issue_num} about the oceanbase pull request {pull.id}", job.id)
task_id = issue_num.replace("T", "")
if task_id != "":
await Log(LogType.INFO,
f"Create the task {task_id} successfully by ob flow", job.id)
await cmd.shell(
f"/usr/local/obdev/libexec/ob-flow start T{task_id} {job.gitlab_branch}", '.', job, env=dict(
os.environ, OB_FLOW_PROJECT='oceanbase'))
task_addr = f"/data/1/wangzelin.wzl/task-{task_id}"
        await apply_diff(project, job, pull, task_addr)
await cmd.shell(f"git commit -m \"{title}\"", task_addr, job)
await cmd.shell("/usr/local/obdev/libexec/ob-flow checkin", task_addr, job)
service = PullRequestService()
await service.update_inline_status(pull, True)
await service.update_latest_commit(pull)
async def sync_pull_request(project, job):
organization, repo = github.transfer_github_to_name(project.github_address)
if organization and repo:
pull_request_service = PullRequestService()
await pull_request_service.sync_pull_request(project.name, organization, repo)
pull_request_service = PullRequestService()
pull_request_list = await pull_request_service.fetch_pull_request(project=job.project)
if pull_request_list and len(pull_request_list) > 0:
await Log(LogType.INFO,
f"There are {len(pull_request_list)} pull requests in the database", job.id)
for pull in pull_request_list:
if pull.target_branch == job.github_branch:
await Log(LogType.INFO,
f"Judge the pull request #{pull.id} of project {project.name} if need to merge", job.id)
need_merge = await pull_request_service.judge_pull_request_need_merge(project.name, organization, repo, pull.id)
if need_merge:
await Log(LogType.INFO,
f"The pull request #{pull.id} of project {project.name} need merge", job.id)
if job.project == "oceanbase":
# Add a self config to sync the pull request what you want
if pull.id in config.OCEANBASE:
await sync_oceanbase(project, job, pull)
else:
await sync_common(project, job, pull)
else:
await Log(LogType.INFO,
f"The pull request #{pull.id} of project {project.name} does not need merge", job.id)
return
async def sync_inter_code(project, job):
# Judge the repo type
if job.type == SyncType.OneWay:
await sync_oneway_inter_code(project, job)
elif job.type == SyncType.TwoWay:
await sync_twoway_inter_code(project, job)
else:
        await Log(LogType.ERROR,
                  f"The job {job.github_branch}'s type of project {project.name} is wrong", job.id)
return
async def sync_oneway_inter_code(project: ProjectDTO, job: JobDTO):
service = JobService()
await Log(LogType.INFO, "Sync the job code to outer", job.id)
dir = f"/data/1/tmp/{job.project}_job_outer_{job.id}"
await Log(LogType.INFO, f"The sync work dir is {dir}", job.id)
try:
await cmd.shell('mkdir ' + dir, '.', job)
await cmd.shell(
f"git clone -b {job.gitlab_branch} {project.gitlab_address} --depth=100", dir, job)
repo_dir = dir + "/" + project.name
await cmd.shell('git status', repo_dir, job)
await cmd.shell(
f"git remote add github {project.github_address}", repo_dir, job)
await cmd.shell('git fetch github', repo_dir, job)
await cmd.shell(
f"git checkout -b out_branch github/{job.github_branch}", repo_dir, job)
await cmd.shell('git checkout ' + job.gitlab_branch, repo_dir, job)
if project.gitee_address:
await cmd.shell(
f"git remote add gitee {project.gitee_address}", repo_dir, job)
await cmd.shell('git fetch gitee', repo_dir, job)
if project.code_china_address:
await cmd.shell(
f"git remote add csdn {project.code_china_address}", repo_dir, job)
result, err = await cmd.shell('git status', repo_dir, job)
# fetch the latest commit
latestCommit = await service.get_job_lateset_commit(job.id)
        await Log(LogType.INFO, 'The latest commit is ' + latestCommit, job.id)
if latestCommit == 'no_commit':
result, err = await cmd.shell(
f"git log HEAD^1..HEAD --oneline --merges", repo_dir, job)
commit = result.split(" ")[0]
await Log(LogType.INFO, f"patch the commit {commit}", job.id)
await patch_every_commit(repo_dir, project, job, commit)
return
else:
result, err = await cmd.shell(
"git log "+latestCommit + "..HEAD --oneline --merges", repo_dir, job)
if result == "":
await Log(LogType.INFO,
f"The commit {latestCommit} is the newest commit on the remote branch", job.id)
else:
commit_info_list = str.splitlines(result)
commit_info_list.reverse()
for commit_info in commit_info_list:
commit = commit_info.split(" ")[0]
await Log(LogType.INFO, "patch the commit " + commit, job.id)
await patch_every_commit(repo_dir, project, job, commit)
except:
msg = f"Sync the code from inter to outer of project {project.name} branch {job.github_branch} failed"
await Log(LogType.ERROR, msg, job.id)
finally:
await cmd.shell(f"rm -rf {dir}", '.', job)
await Log(LogType.INFO, f"remove the temper repo folder {dir}", job.id)
return
async def sync_twoway_inter_code(project, job):
service = JobService()
await Log(LogType.INFO, "Sync the job document to outer", job.id)
dir = "/tmp/" + job.project+"_job_outer_"+str(job.id)
await Log(LogType.INFO, f"The sync work dir is {dir}", job.id)
try:
await cmd.shell('mkdir ' + dir, '.', job)
await cmd.shell(
f"git clone -b {job.gitlab_branch} {project.gitlab_address}", dir, job)
repo_dir = dir + "/" + project.name
await cmd.shell('git status', repo_dir, job)
await cmd.shell('git remote add github ' +
project.github_address, repo_dir, job)
await cmd.shell('git fetch github', repo_dir, job)
await cmd.shell('git pull -r github ' + job.github_branch, repo_dir, job)
await cmd.shell(
f"git push origin {job.gitlab_branch} -f", repo_dir, job)
await cmd.shell(
f"git push github {job.github_branch} -f", repo_dir, job)
if project.gitee_address:
await cmd.shell('git remote add gitee ' +
project.gitee_address, repo_dir, job)
await cmd.shell('git fetch gitee', repo_dir, job)
await cmd.shell(
f"git push gitee {job.gitlab_branch}:{job.gitee_branch}", repo_dir, job)
if project.code_china_address:
await cmd.shell('git remote add csdn ' +
project.code_china_address, repo_dir, job)
result, err = await cmd.shell('git status', repo_dir, job)
await cmd.shell(
f"git push csdn {job.gitlab_branch}:{job.code_china_branch}", repo_dir, job)
# update the latest commit hash
result, err = await cmd.shell(
"git log HEAD~1..HEAD --oneline", repo_dir, job)
commit = result.split(" ")[0]
await service.update_job_lateset_commit(job.id, commit)
except:
msg = f"Sync the document from inter to outer of project {project.name} branch {job.github_branch} failed"
        await Log(LogType.ERROR, msg, job.id)
finally:
await cmd.shell(f"rm -rf {dir}", '.', job)
await Log(LogType.INFO, f"remove the temper repo folder {dir}", job.id)
return
def get_author_from_oceanbase(author_content: str) -> Optional[str]:
partten = r'Author : (.*) \((.*)\)'
matchObj = re.match(partten, author_content, re.M | re.I)
if matchObj:
author = matchObj.group(2)
return author.split('#')[0]
return None
async def patch_every_commit(dir, project, job, commit):
service = JobService()
try:
await cmd.shell('git status', dir, job)
await cmd.shell('git checkout ' + job.gitlab_branch, dir, job)
await cmd.shell('git pull -r origin ' + job.gitlab_branch, dir, job)
await cmd.shell('git reset --hard ' + commit, dir, job)
# Get the commit comment
output, err = await cmd.shell("git log -1", dir, job)
email, err = await cmd.shell("git log --format='%ae' -1", dir, job)
if email is None:
raise ValueError("The commit has no email")
await Log(LogType.INFO, f"The commit {commit} email is {email}", job.id)
if project.name == "oceanbase":
author_string = str.splitlines(output)[8].strip()
await Log(LogType.INFO,
f"The author string is {author_string}", job.id)
domain = get_author_from_oceanbase(author_string)
else:
domain = author.get_author_domain(email)
if domain is None:
raise ValueError("The commit author has no ali domain")
await Log(LogType.INFO, f"The commit author ali domain is {domain}", job.id)
content = str.splitlines(output)[5].strip()
await Log(LogType.INFO, f"content is {content}", job.id)
if content is None or content == "":
raise ValueError("The commit has no commit content")
await Log(LogType.INFO, f"The commit {commit} content is {content}", job.id)
# TODO if find the commit is from github, merge the pull request
if content.startswith("Github Merge"):
pr_id = int(content.split()[3].replace('#', ''))
pr_service = PullRequestService()
organization, repo = github.transfer_github_to_name(
project.github_address)
ans = await pr_service.merge_pull_request_code(organization, repo, pr_id)
if ans is None:
return
# if the repo has .ce file, it means we should do something before merge
# the code from inter to outer
ce_file = Path(dir + '/.ce')
if ce_file.is_file():
await cmd.shell('bash .ce', dir, job)
else:
await Log(LogType.INFO,
f"There is no .ce file in the project {project.name}", job.id)
# TODO check git diff apply --check
diff, err = await cmd.shell("git diff out_branch", dir, job)
if diff == "":
# The diff is empty, save the commit and return
await cmd.shell('git reset --hard', dir, job)
await service.update_job_lateset_commit(job.id, commit)
return
patch_file = '/tmp/' + job.github_branch + '_patch'
await cmd.shell('rm -rf ' + patch_file, dir, job)
with open(patch_file, "w") as diff_file:
diff, err = await cmd.shell("git diff out_branch", dir, job)
diff_file.write(diff)
await cmd.shell('git reset --hard', dir, job)
await cmd.shell('git checkout out_branch', dir, job)
# git apply --check first
# out, err = await cmd.shell('git apply --check ' + patch_file, dir, job)
if err != "":
raise ValueError(
f"The commit {commit} has conflict to the branch {job.github_branch}")
await cmd.shell('git apply ' + patch_file, dir, job)
await cmd.shell('git add .', dir, job)
await cmd.shell(f"git commit -m \"{content}\"", dir, job)
# TODO:change commit author
out = await author.get_github_author_and_email(domain)
if out['author'] is None or out['email'] is None:
await Log(LogType.ERROR, f"The commit has no correct author or email", job.id)
raise ValueError("That is not a positive author or email")
await Log(LogType.INFO,
f"Get the commit author {out['author']} and email {out['email']}", job.id)
author_info = f"{out['author']} <{out['email']}>"
await cmd.shell(
f"git commit --amend --no-edit --author=\"{author_info}\"", dir, job)
await cmd.shell(f"git pull -r github {job.github_branch}", dir, job)
await cmd.shell(f"git push -u github out_branch:{job.github_branch}", dir, job)
if job.gitee_branch is not None:
await cmd.shell(f"git pull -r gitee {job.gitee_branch}", dir, job)
await cmd.shell(f"git push -u gitee out_branch:{job.gitee_branch}", dir, job)
if job.code_china_branch is not None:
await cmd.shell(f"git pull -r csdn {job.code_china_branch}", dir, job)
await cmd.shell(f"git push -u csdn out_branch:{job.code_china_branch}", dir, job)
await cmd.shell(f"git checkout {job.gitlab_branch}", dir, job)
# save the latest commit
ans = await service.update_job_lateset_commit(job.id, commit)
if ans:
await Log(LogType.INFO,
f"Update the latest commit {commit} successfully", job.id)
except:
msg = f"Sync the commit {commit} of project {project.name} failed"
await Log(LogType.ERROR, msg, job.id)
return
async def sync_job(job: JobDTO):
project_service = ProjectService()
project = await project_service.search_project(name=job.project)
if len(project) == 0:
await Log(LogType.INFO, "There are no projects in the database", job.id)
return
# 1. sync the outer pull request into inter
if job.type == SyncType.OneWay:
await sync_pull_request(project[0], job)
# 2. sync the inter code into outer
await sync_inter_code(project[0], job)
async def sync():
logger.info("Start syncing ****************************")
log_service = LogService()
await log_service.delete_logs()
# fetch the sync job list
service = JobService()
jobs = await service.list_jobs()
if jobs is None:
logger.info(f"There are no sync jobs in the database")
return
logger.info(f"There are {len(jobs)} sync jobs in the database")
tasks = []
for job in jobs:
# if the job status is green, it means we can sync the job
if job.status == Color.green:
await Log(LogType.INFO,
f"The github branch {job.github_branch} from {job.project} is now syncing", job.id)
task = asyncio.create_task(sync_job(job))
tasks.append(task)
else:
await Log(LogType.INFO,
f"The github branch {job.github_branch} from {job.project} does not need to sync", job.id)
for task in tasks:
await task
logger.info("End syncing ****************************")
if __name__ == '__main__':
    asyncio.run(sync())

372
sync/diff_logic_demo.py Normal file
View File

@ -0,0 +1,372 @@
# The diff_logic_demo.py script demonstrates the diff-based sync logic for the oceanbase project
import asyncio
import json
from typing import Optional
import requests
import re
from pathlib import Path
import sys
sys.path.append('..') # NOQA: E402
from src.service.pull_request import PullRequestService
from src.service.sync import JobService, ProjectService
from src.service.log import LogService
from src.dto.sync import Color, SyncType
from src.dto.sync import Job as JobDTO
from src.dto.pull_request import PullRequest as PullRequestDTO
from src.dto.sync import Project as ProjectDTO
from src.utils import cmd, author, gitlab, github
from src.utils.logger import logger
from src.base import config
from src.utils.logger import Log
from src.base.code import LogType
from src.common.repo import Repo, RepoType
async def apply_diff(project, job, pull: PullRequestDTO, dir):
organization, repo = github.transfer_github_to_name(project.github_address)
baseUrl = ""
if pull.type == RepoType.Github:
baseUrl = f"{config.GITHUB_ENV['github_api_diff_address']}/{organization}/{repo}/pull/"
elif pull.type == RepoType.Gitee:
baseUrl = f"{config.GITEE_ENV['gitee_api_diff_address']}/{organization}/{repo}/pull/"
elif pull.type == RepoType.Gitcode:
pass
diffUrl = baseUrl + str(pull.id) + ".diff"
tmpfile = dir + "/" + str(pull.id) + "_diff"
download_diff_cmd = ""
if pull.type == RepoType.Github:
download_diff_cmd = f"curl -X GET {diffUrl} -H 'Accept: application/vnd.github.v3.diff'"
elif pull.type == RepoType.Gitee:
download_diff_cmd = f"curl -X GET {diffUrl}"
elif pull.type == RepoType.Gitcode:
pass
with open(tmpfile, "w") as diff_file:
diff, err = await cmd.shell(download_diff_cmd, dir, job)
diff_file.write(diff)
# git apply --check first
out, err = await cmd.shell('git apply --check ' + tmpfile, dir, job)
if out != "":
raise ValueError(f"git apply --check failed")
out, err = await cmd.shell('git apply ' + tmpfile, dir, job)
if err.startswith("error"):
await Log(LogType.ERROR, "The git apply operation has some conflict", job.id)
await cmd.shell('rm -rf ' + dir, '.', job)
return
await cmd.shell(f"rm -rf {tmpfile}", dir, job)
await cmd.shell('git add .', dir, job)
async def sync_common(project, job, pull: PullRequestDTO):
try:
await Log(LogType.INFO, f"The project base repo is {project.base}", job.id)
await Log(LogType.INFO, f"Sync the job code from other repo to base {project.base} repo", job.id)
dir = f"/tmp/{job.project}_job_inter_{job.id}_pull_{pull.id}"
await Log(LogType.INFO, f"The pull request dir is {dir}", job.id)
await cmd.shell('mkdir ' + dir, '.', job)
await cmd.shell(
f"git clone -b {job.gitlab_branch} {project.gitlab_address}", dir, job)
repo_dir = dir + "/" + project.name
# GitHub pull request
if project.base != RepoType.Github and project.github_address is not None:
await cmd.shell('git status', repo_dir, job)
new_branch = 'github_pr' + str(pull.id)
await Log(LogType.INFO, f"The new branch is {new_branch}", job.id)
await cmd.shell('git checkout -b ' + new_branch, repo_dir, job)
            await apply_diff(project, job, pull, repo_dir)
commit_msg = "Github pull request #" + str(pull.id)
await cmd.shell(f"git commit -m \"{commit_msg}\"", repo_dir, job)
await cmd.shell(f"git push -uv origin {new_branch}", repo_dir, job)
inter_type = gitlab.get_inter_repo_type(project.gitlab_address)
if inter_type is None:
            await Log(LogType.ERROR,
                      f"The {project.gitlab_address} does not belong to gitlab or antcode", job.id)
else:
# send a merge request
                # TODO: based on your base repo type
# update the pull request inline status
service = PullRequestService()
await service.update_inline_status(pull, True)
await service.update_latest_commit(pull)
# Gitee pull request
# TODO: Gitcode pull request
    except Exception as e:
        msg = f"The pull request #{pull.id} sync to the internal failed: {e}"
        await Log(LogType.ERROR, msg, job.id)
finally:
await cmd.shell('rm -rf ' + dir, '.', job)
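# Refresh the project's pull requests from the external platform, then for each
# PR that targets the job's github branch decide whether it needs merging and,
# if so, inline it via sync_common.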
async def sync_pull_request(project, job):
organization, repo = github.transfer_github_to_name(project.github_address)
if organization and repo:
pull_request_service = PullRequestService()
await pull_request_service.sync_pull_request(project.name, organization, repo)
pull_request_service = PullRequestService()
pull_request_list = await pull_request_service.fetch_pull_request(project=job.project)
if pull_request_list and len(pull_request_list) > 0:
await Log(LogType.INFO,
f"There are {len(pull_request_list)} pull requests in the database", job.id)
for pull in pull_request_list:
if pull.target_branch == job.github_branch:
                await Log(LogType.INFO,
                          f"Check whether the pull request #{pull.id} of project {project.name} needs to be merged", job.id)
need_merge = await pull_request_service.judge_pull_request_need_merge(project.name, organization, repo, pull.id)
if need_merge:
                    await Log(LogType.INFO,
                              f"The pull request #{pull.id} of project {project.name} needs to be merged", job.id)
await sync_common(project, job, pull)
else:
                    await Log(LogType.INFO,
                              f"The pull request #{pull.id} of project {project.name} does not need to be merged", job.id)
return
async def sync_inter_code(project, job):
# Judge the repo type
if job.type == SyncType.OneWay:
await sync_inter_code_by_diff(project, job)
else:
        await Log(LogType.ERROR,
                  f"The job {job.github_branch}'s type of project {project.name} is wrong", job.id)
return
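# Diff-based outer sync: clone the internal (gitlab) branch, add the external
# remotes, track the external github branch as out_branch, then replay every
# merge commit made since the job's last recorded commit via patch_every_commit.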
async def sync_inter_code_by_diff(project: ProjectDTO, job: JobDTO):
service = JobService()
await Log(LogType.INFO, "Sync the job code to outer", job.id)
dir = f"/data/1/tmp/{job.project}_job_outer_{job.id}"
await Log(LogType.INFO, f"The sync work dir is {dir}", job.id)
try:
await cmd.shell('mkdir ' + dir, '.', job)
await cmd.shell(
f"git clone -b {job.gitlab_branch} {project.gitlab_address} --depth=100", dir, job)
repo_dir = dir + "/" + project.name
await cmd.shell('git status', repo_dir, job)
await cmd.shell(
f"git remote add github {project.github_address}", repo_dir, job)
await cmd.shell('git fetch github', repo_dir, job)
await cmd.shell(
f"git checkout -b out_branch github/{job.github_branch}", repo_dir, job)
await cmd.shell('git checkout ' + job.gitlab_branch, repo_dir, job)
if project.gitee_address:
await cmd.shell(
f"git remote add gitee {project.gitee_address}", repo_dir, job)
await cmd.shell('git fetch gitee', repo_dir, job)
if project.code_china_address:
await cmd.shell(
f"git remote add csdn {project.code_china_address}", repo_dir, job)
result, err = await cmd.shell('git status', repo_dir, job)
# fetch the latest commit
latestCommit = await service.get_job_lateset_commit(job.id)
        await Log(LogType.INFO, 'The latest commit is ' + latestCommit, job.id)
if latestCommit == 'no_commit':
result, err = await cmd.shell(
f"git log HEAD^1..HEAD --oneline --merges", repo_dir, job)
commit = result.split(" ")[0]
await Log(LogType.INFO, f"patch the commit {commit}", job.id)
await patch_every_commit(repo_dir, project, job, commit)
return
else:
result, err = await cmd.shell(
"git log "+latestCommit + "..HEAD --oneline --merges", repo_dir, job)
if result == "":
await Log(LogType.INFO,
f"The commit {latestCommit} is the newest commit on the remote branch", job.id)
else:
commit_info_list = str.splitlines(result)
commit_info_list.reverse()
for commit_info in commit_info_list:
commit = commit_info.split(" ")[0]
await Log(LogType.INFO, "patch the commit " + commit, job.id)
await patch_every_commit(repo_dir, project, job, commit)
    except Exception as e:
        msg = f"Sync the code from inter to outer of project {project.name} branch {job.github_branch} failed: {e}"
        await Log(LogType.ERROR, msg, job.id)
finally:
        # cleanup is currently disabled; re-enable this line to delete the work dir
        # await cmd.shell(f"rm -rf {dir}", '.', job)
        await Log(LogType.INFO, f"keep the temporary repo folder {dir}", job.id)
return
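# Replay one internal merge commit onto out_branch: reset to the commit, recover
# the author's domain and the commit subject, optionally run a repo-provided .ce
# hook, turn the diff against out_branch into a patch, apply and commit it on
# out_branch with the mapped external author, push to github (and gitee/csdn
# when configured), and record the commit as the job's latest synced commit.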
async def patch_every_commit(dir, project, job, commit):
service = JobService()
try:
await cmd.shell('git status', dir, job)
await cmd.shell('git checkout ' + job.gitlab_branch, dir, job)
await cmd.shell('git pull -r origin ' + job.gitlab_branch, dir, job)
await cmd.shell('git reset --hard ' + commit, dir, job)
# Get the commit comment
output, err = await cmd.shell("git log -1", dir, job)
email, err = await cmd.shell("git log --format='%ae' -1", dir, job)
if email is None:
raise ValueError("The commit has no email")
await Log(LogType.INFO, f"The commit {commit} email is {email}", job.id)
domain = author.get_author_domain(email)
if domain is None:
raise ValueError("The commit author has no ali domain")
await Log(LogType.INFO, f"The commit author ali domain is {domain}", job.id)
content = str.splitlines(output)[5].strip()
await Log(LogType.INFO, f"content is {content}", job.id)
if content is None or content == "":
raise ValueError("The commit has no commit content")
await Log(LogType.INFO, f"The commit {commit} content is {content}", job.id)
# TODO if find the commit is from github, merge the pull request
if content.startswith("Github Merge"):
pr_id = int(content.split()[4].replace('#', ''))
pr_service = PullRequestService()
organization, repo = github.transfer_github_to_name(
project.github_address)
ans = await pr_service.merge_pull_request_code(organization, repo, pr_id)
if ans is None:
return
# if the repo has .ce file, it means we should do something before merge
# the code from inter to outer
ce_file = Path(dir + '/.ce')
if ce_file.is_file():
await cmd.shell('bash .ce', dir, job)
else:
await Log(LogType.INFO,
f"There is no .ce file in the project {project.name}", job.id)
# TODO check git diff apply --check
diff, err = await cmd.shell("git diff out_branch", dir, job)
if diff == "":
# The diff is empty, save the commit and return
await cmd.shell('git reset --hard', dir, job)
await service.update_job_lateset_commit(job.id, commit)
return
patch_file = '/tmp/' + job.github_branch + '_patch'
await cmd.shell('rm -rf ' + patch_file, dir, job)
with open(patch_file, "w") as diff_file:
diff, err = await cmd.shell("git diff out_branch", dir, job)
diff_file.write(diff)
await cmd.shell('git reset --hard', dir, job)
await cmd.shell('git checkout out_branch', dir, job)
        # git apply --check first, so a conflicting patch fails before touching the tree
        out, err = await cmd.shell('git apply --check ' + patch_file, dir, job)
        if err != "":
            raise ValueError(
                f"The commit {commit} conflicts with the branch {job.github_branch}")
await cmd.shell('git apply ' + patch_file, dir, job)
await cmd.shell('git add .', dir, job)
await cmd.shell(f"git commit -m \"{content}\"", dir, job)
# TODO:change commit author
out = await author.get_github_author_and_email(domain)
        if out['author'] is None or out['email'] is None:
            await Log(LogType.ERROR, "The commit has no valid author or email", job.id)
            raise ValueError("Could not resolve a valid author or email")
await Log(LogType.INFO,
f"Get the commit author {out['author']} and email {out['email']}", job.id)
author_info = f"{out['author']} <{out['email']}>"
await cmd.shell(
f"git commit --amend --no-edit --author=\"{author_info}\"", dir, job)
await cmd.shell(f"git pull -r github {job.github_branch}", dir, job)
await cmd.shell(f"git push -u github out_branch:{job.github_branch}", dir, job)
if job.gitee_branch is not None:
await cmd.shell(f"git pull -r gitee {job.gitee_branch}", dir, job)
await cmd.shell(f"git push -u gitee out_branch:{job.gitee_branch}", dir, job)
if job.code_china_branch is not None:
await cmd.shell(f"git pull -r csdn {job.code_china_branch}", dir, job)
await cmd.shell(f"git push -u csdn out_branch:{job.code_china_branch}", dir, job)
await cmd.shell(f"git checkout {job.gitlab_branch}", dir, job)
# save the latest commit
ans = await service.update_job_lateset_commit(job.id, commit)
if ans:
await Log(LogType.INFO,
f"Update the latest commit {commit} successfully", job.id)
    except Exception as e:
        msg = f"Sync the commit {commit} of project {project.name} failed: {e}"
        await Log(LogType.ERROR, msg, job.id)
return
async def sync_job(job: JobDTO):
project_service = ProjectService()
project = await project_service.search_project(name=job.project)
if len(project) == 0:
await Log(LogType.INFO, "There are no projects in the database", job.id)
return
# 1. sync the outer pull request into inter
if job.type == SyncType.OneWay:
await sync_pull_request(project[0], job)
# 2. sync the inter code into outer
await sync_inter_code(project[0], job)
async def sync():
logger.info("Start syncing ****************************")
log_service = LogService()
await log_service.delete_logs()
# fetch the sync job list
service = JobService()
jobs = await service.list_jobs()
if jobs is None:
logger.info(f"There are no sync jobs in the database")
return
logger.info(f"There are {len(jobs)} sync jobs in the database")
tasks = []
for job in jobs:
# if the job status is green, it means we can sync the job
if job.status == Color.green:
await Log(LogType.INFO,
f"The github branch {job.github_branch} from {job.project} is now syncing", job.id)
task = asyncio.create_task(sync_job(job))
tasks.append(task)
else:
await Log(LogType.INFO,
f"The github branch {job.github_branch} from {job.project} does not need to sync", job.id)
for task in tasks:
await task
logger.info("End syncing ****************************")
if __name__ == '__main__':
loop = asyncio.get_event_loop()
loop.run_until_complete(sync())

131
sync/merge_logic_demo.py Normal file
View File

@ -0,0 +1,131 @@
# The merge_logic_demo.py script syncs a project across remotes using a pull/merge-then-push flow
import asyncio
import sys
sys.path.append('..') # NOQA: E402
from src.service.sync import JobService, ProjectService
from src.service.log import LogService
from src.dto.sync import Color, SyncType
from src.dto.sync import Job as JobDTO
from src.dto.sync import Project as ProjectDTO
from src.utils import cmd, github, gitee
from src.utils.logger import logger
from src.base import config
from src.utils.logger import Log
from src.base.code import LogType
async def sync_inter_code(project, job):
if job.type == SyncType.TwoWay:
await sync_inter_code_by_rabase(project, job)
else:
        await Log(LogType.ERROR,
                  f"The job {job.github_branch}'s type of project {project.name} is wrong", job.id)
return
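# Merge-based two-way sync: clone the github branch, pull the gitee and gitcode
# branches into it (creating merge commits), then push the combined history back
# to every configured remote.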
async def sync_inter_code_by_rabase(project: ProjectDTO, job: JobDTO):
service = JobService()
await Log(LogType.INFO, "Sync the job document to outer", job.id)
dir = "/tmp/" + job.project+"_job_outer_"+str(job.id)
await Log(LogType.INFO, f"The sync work dir is {dir}", job.id)
try:
await cmd.shell('mkdir ' + dir, '.', job)
# base on the github new authentication rules
# provide only two ways to clone and push code
# 1. ssh key and ssh address
# 2. http address with token
github_address = github.get_github_address_with_token(project.github_address, project.github_token)
await cmd.shell(
f"git clone -b {job.github_branch} {github_address} {project.name}", dir, job)
repo_dir = dir + "/" + project.name
# if you need http address with token
# github_address = github.get_github_address_with_token(project.github_address, project.github_token)
await cmd.shell('git status', repo_dir, job)
# if gitee is not null, sync it
if project.gitee_address:
gitee_address = gitee.get_gitee_address_with_token(project.gitee_address, project.gitee_token)
await cmd.shell('git remote add gitee ' +
gitee_address, repo_dir, job)
await cmd.shell('git fetch gitee', repo_dir, job)
await cmd.shell(f"git pull gitee {job.gitee_branch} --no-edit", repo_dir, job)
# if gitcode is not null, sync it
if project.code_china_address:
await cmd.shell('git remote add csdn ' +
project.code_china_address, repo_dir, job)
await cmd.shell('git fetch csdn', repo_dir, job)
await cmd.shell(f"git pull csdn {job.code_china_branch} --no-edit", repo_dir, job)
await cmd.shell(
f"git push origin {job.github_branch}", repo_dir, job)
if project.gitee_address:
await cmd.shell(
f"git push gitee {job.github_branch}:{job.gitee_branch}", repo_dir, job)
if project.code_china_address:
await cmd.shell(
f"git push csdn {job.github_branch}:{job.code_china_branch}", repo_dir, job)
        # update the latest commit hash
        # (for the rebase logic this may not be useful)
result, err = await cmd.shell(
"git log HEAD~1..HEAD --oneline", repo_dir, job)
commit = result.split(" ")[0]
await service.update_job_lateset_commit(job.id, commit)
    except Exception as e:
        msg = f"Sync the code from inter to outer of project {project.name} branch {job.github_branch} failed: {e}"
        await Log(LogType.ERROR, msg, job.id)
finally:
await cmd.shell(f"rm -rf {dir}", '.', job)
await Log(LogType.INFO, f"remove the temper repo folder {dir}", job.id)
return
async def sync_job(job: JobDTO):
project_service = ProjectService()
project = await project_service.search_project(name=job.project)
if len(project) == 0:
await Log(LogType.INFO, "There are no projects in the database", job.id)
return
    # 1. The git rebase logic does not need to fetch pull requests.
# 2. sync the inter code into outer
await sync_inter_code(project[0], job)
async def sync():
logger.info("Start syncing ****************************")
log_service = LogService()
await log_service.delete_logs()
# fetch the sync job list
service = JobService()
jobs = await service.list_jobs()
if jobs is None:
logger.info(f"There are no sync jobs in the database")
return
logger.info(f"There are {len(jobs)} sync jobs in the database")
tasks = []
for job in jobs:
# if the job status is green, it means we can sync the job
if job.status == Color.green:
await Log(LogType.INFO,
f"The github branch {job.github_branch} from {job.project} is now syncing", job.id)
task = asyncio.create_task(sync_job(job))
tasks.append(task)
else:
await Log(LogType.INFO,
f"The github branch {job.github_branch} from {job.project} does not need to sync", job.id)
for task in tasks:
await task
logger.info("End syncing ****************************")
if __name__ == '__main__':
loop = asyncio.get_event_loop()
loop.run_until_complete(sync())

View File

@ -0,0 +1,171 @@
# This script syncs a project between gitlink and the external repos (github/gitee) using the merge flow
import asyncio
import sys
import json
import time
sys.path.append('..') # NOQA: E402
from src.service.sync import JobService, ProjectService
from src.service.log import LogService
from src.dto.sync import Color, SyncType
from src.dto.sync import Job as JobDTO
from src.dto.sync import Project as ProjectDTO
from src.utils import cmd, github, gitee, gitlink
from src.utils.logger import logger
from src.base import config
from src.utils.logger import Log
from src.base.code import LogType
async def sync_inter_code(project, job):
if job.type == SyncType.TwoWay:
await sync_inter_code_by_rabase(project, job)
else:
        await Log(LogType.ERROR,
                  f"The job {job.github_branch}'s type of project {project.name} is wrong", job.id)
return
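# Gitlink-centric variant of the merge-based sync: clone the gitlink branch,
# pull the github and gitee branches into it, then push the merged history back
# to gitlink and the other remotes.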
async def sync_inter_code_by_rabase(project: ProjectDTO, job: JobDTO):
service = JobService()
await Log(LogType.INFO, "Sync the job document to outer", job.id)
dir = "/tmp/" + job.project+"_job_outer_"+str(job.id)
await Log(LogType.INFO, f"The sync work dir is {dir}", job.id)
try:
await cmd.shell('mkdir ' + dir, '.', job)
# base on the github new authentication rules
# provide only two ways to clone and push code
# 1. ssh key and ssh address
# 2. http address with token
#
await cmd.shell(
f"git clone -b {job.gitlink_branch} {project.gitlink_address} {project.name}", dir, job)
repo_dir = dir + "/" + project.name
# if you need http address with token
# github_address = github.get_github_address_with_token(project.github_address, project.github_token)
await cmd.shell('git status', repo_dir, job)
# if github is not null, sync it
if project.github_address and job.github_branch:
github_address = github.get_github_address_with_token(project.github_address, project.github_token)
await cmd.shell('git remote add github ' +
github_address, repo_dir, job)
await cmd.shell('git fetch github', repo_dir, job)
await cmd.shell(f"git pull github {job.github_branch} --no-edit", repo_dir, job)
# if gitee is not null, sync it
if project.gitee_address and job.gitee_branch:
# gitee
gitee_address = gitee.get_gitee_address_with_token(project.gitee_address, project.gitee_token)
await cmd.shell('git remote add gitee ' +
gitee_address, repo_dir, job)
await cmd.shell('git fetch gitee', repo_dir, job)
await cmd.shell(f"git pull gitee {job.gitee_branch} --no-edit", repo_dir, job)
# # if gitcode is not null, sync it
# if project.code_china_address:
# await cmd.shell('git remote add csdn ' +
# project.code_china_address, repo_dir, job)
# await cmd.shell('git fetch csdn', repo_dir, job)
# await cmd.shell(f"git pull csdn {job.code_china_branch} --no-edit", repo_dir, job)
await cmd.shell(
f"git push origin {job.gitlink_branch}", repo_dir, job)
if project.github_address and job.github_branch:
await cmd.shell(
f"git push github {job.gitlink_branch}:{job.github_branch}", repo_dir, job)
if project.gitee_address and job.gitee_branch:
await cmd.shell(
f"git push gitee {job.gitlink_branch}:{job.gitee_branch}", repo_dir, job)
        # the csdn remote is never added above (the gitcode block is commented
        # out), so this push is disabled to match
        # if project.code_china_address:
        #     await cmd.shell(
        #         f"git push csdn {job.gitlink_branch}:{job.code_china_branch}", repo_dir, job)
# update the latest commit hash
# for rebase logic maybe is not useful
result, err = await cmd.shell(
"git log HEAD~1..HEAD --oneline", repo_dir, job)
commit = result.split(" ")[0]
await service.update_job_lateset_commit(job.id, commit)
except Exception as e:
msg = f"Sync the code from inter to outer of project {project.name} branch {job.github_branch} failed {e}"
await Log(LogType.ERROR, msg, job.id)
finally:
await cmd.shell(f"rm -rf {dir}", '.', job)
await Log(LogType.INFO, f"remove the temper repo folder {dir}", job.id)
return
async def sync_job(job: JobDTO):
project_service = ProjectService()
project = await project_service.search_project(name=job.project)
if len(project) == 0:
await Log(LogType.INFO, "There are no projects in the database", job.id)
return
    # 1. The git rebase logic does not need to fetch pull requests.
# 2. sync the inter code into outer
await sync_inter_code(project[0], job)
async def sync():
logger.info("Start syncing ****************************")
log_service = LogService()
await log_service.delete_logs()
# fetch the sync job list
service = JobService()
# jobs = await service.list_jobs()
# if jobs is None:
# logger.info(f"There are no sync jobs in the database")
# return
# logger.info(f"There are {len(jobs)} sync jobs in the database")
#
# tasks = []
# for job in jobs:
# # if the job status is green, it means we can sync the job
# if job.status == Color.green:
# await Log(LogType.INFO,
# f"The gitlink branch {job.gitlink_branch} from {job.project} is now syncing", job.id)
# task = asyncio.create_task(sync_job(job))
# tasks.append(task)
# else:
# await Log(LogType.INFO,
# f"The gitlink branch {job.gitlink_branch} from {job.project} does not need to sync", job.id)
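    # two-phase scheduling: run all gitee-sourced jobs to completion first, then
    # the github-sourced jobs, presumably so concurrent pushes to the shared
    # gitlink branches do not race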
gitee_jobs = await service.source_list_jobs("gitee_branch")
github_jobs = await service.source_list_jobs("github_branch")
    gitee_jobs = gitee_jobs or []
    github_jobs = github_jobs or []
    if not gitee_jobs and not github_jobs:
        logger.info("There are no sync jobs in the database")
        return
    logger.info(f"There are {len(gitee_jobs) + len(github_jobs)} sync jobs in the database")
tasks = []
for job in gitee_jobs:
await Log(LogType.INFO,
f"The gitlink branch {job.gitlink_branch} from {job.project} is now syncing...", job.id)
task = asyncio.create_task(sync_job(job))
tasks.append(task)
for task in tasks:
await task
    await asyncio.sleep(10)  # non-blocking pause between the two waves
logger.info("step 2 syncing.................")
tasks2 = []
for job in github_jobs:
await Log(LogType.INFO,
f"The gitlink branch {job.gitlink_branch} from {job.project} is now syncing", job.id)
task = asyncio.create_task(sync_job(job))
tasks2.append(task)
for task in tasks2:
await task
logger.info("End syncing ****************************")
if __name__ == '__main__':
loop = asyncio.get_event_loop()
loop.run_until_complete(sync())

Some files were not shown because too many files have changed in this diff.