// Copyright 2019 The Gitea Authors. All rights reserved.
// Copyright 2018 Jonas Franz. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package migrations

import (
	"context"
	"fmt"
	"io"
	"os"
	"path/filepath"
	"strconv"
	"strings"
	"time"

	"code.gitea.io/gitea/models"
	"code.gitea.io/gitea/models/db"
	"code.gitea.io/gitea/models/foreignreference"
	issues_model "code.gitea.io/gitea/models/issues"
	repo_model "code.gitea.io/gitea/models/repo"
	user_model "code.gitea.io/gitea/models/user"
	"code.gitea.io/gitea/modules/git"
	"code.gitea.io/gitea/modules/log"
	base "code.gitea.io/gitea/modules/migration"
	repo_module "code.gitea.io/gitea/modules/repository"
	"code.gitea.io/gitea/modules/setting"
	"code.gitea.io/gitea/modules/storage"
	"code.gitea.io/gitea/modules/structs"
	"code.gitea.io/gitea/modules/timeutil"
	"code.gitea.io/gitea/modules/uri"
	"code.gitea.io/gitea/services/pull"

	gouuid "github.com/google/uuid"
)

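// Compile-time check that GiteaLocalUploader implements the base.Uploader interface.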
var _ base.Uploader = &GiteaLocalUploader{}

// GiteaLocalUploader implements an Uploader to gitea sites
type GiteaLocalUploader struct {
	ctx            context.Context
	doer           *user_model.User
	repoOwner      string
	repoName       string
	repo           *repo_model.Repository
	labels         map[string]*issues_model.Label
	milestones     map[string]int64
	issues         map[int64]*issues_model.Issue
	gitRepo        *git.Repository
	prHeadCache    map[string]struct{}
	sameApp        bool
	userMap        map[int64]int64 // external user id mapping to user id
	prCache        map[int64]*issues_model.PullRequest
	gitServiceType structs.GitServiceType
}

// NewGiteaLocalUploader creates a gitea Uploader via gitea API v1
func NewGiteaLocalUploader(ctx context.Context, doer *user_model.User, repoOwner, repoName string) *GiteaLocalUploader {
	return &GiteaLocalUploader{
		ctx:         ctx,
		doer:        doer,
		repoOwner:   repoOwner,
		repoName:    repoName,
		labels:      make(map[string]*issues_model.Label),
		milestones:  make(map[string]int64),
		issues:      make(map[int64]*issues_model.Issue),
		prHeadCache: make(map[string]struct{}),
		userMap:     make(map[int64]int64),
		prCache:     make(map[int64]*issues_model.PullRequest),
	}
}

// MaxBatchInsertSize returns the table's max batch insert size
func (g *GiteaLocalUploader) MaxBatchInsertSize(tp string) int {
	switch tp {
	case "issue":
		return db.MaxBatchInsertSize(new(issues_model.Issue))
	case "comment":
		return db.MaxBatchInsertSize(new(issues_model.Comment))
	case "milestone":
		return db.MaxBatchInsertSize(new(issues_model.Milestone))
	case "label":
		return db.MaxBatchInsertSize(new(issues_model.Label))
	case "release":
		return db.MaxBatchInsertSize(new(repo_model.Release))
	case "pullrequest":
		return db.MaxBatchInsertSize(new(issues_model.PullRequest))
	}
	return 10
}

// CreateRepo creates a repository
func (g *GiteaLocalUploader) CreateRepo(repo *base.Repository, opts base.MigrateOptions) error {
	owner, err := user_model.GetUserByName(g.ctx, g.repoOwner)
	if err != nil {
		return err
	}

	var r *repo_model.Repository
	if opts.MigrateToRepoID <= 0 {
		r, err = repo_module.CreateRepository(g.doer, owner, repo_module.CreateRepoOptions{
			Name:           g.repoName,
			Description:    repo.Description,
			OriginalURL:    repo.OriginalURL,
			GitServiceType: opts.GitServiceType,
			IsPrivate:      opts.Private,
			IsMirror:       opts.Mirror,
			Status:         repo_model.RepositoryBeingMigrated,
		})
	} else {
		r, err = repo_model.GetRepositoryByID(opts.MigrateToRepoID)
	}
	if err != nil {
		return err
	}
	r.DefaultBranch = repo.DefaultBranch
	r.Description = repo.Description

	r, err = repo_module.MigrateRepositoryGitData(g.ctx, owner, r, base.MigrateOptions{
		RepoName:       g.repoName,
		Description:    repo.Description,
		OriginalURL:    repo.OriginalURL,
		GitServiceType: opts.GitServiceType,
		Mirror:         repo.IsMirror,
		LFS:            opts.LFS,
		LFSEndpoint:    opts.LFSEndpoint,
		CloneAddr:      repo.CloneURL,
		Private:        repo.IsPrivate,
		Wiki:           opts.Wiki,
		Releases:       opts.Releases, // if didn't get releases, then sync them from tags
		MirrorInterval: opts.MirrorInterval,
	}, NewMigrationHTTPTransport())

	g.sameApp = strings.HasPrefix(repo.OriginalURL, setting.AppURL)
	g.repo = r
	if err != nil {
		return err
	}
	g.gitRepo, err = git.OpenRepository(g.ctx, r.RepoPath())
	return err
}

// Close closes this uploader
func (g *GiteaLocalUploader) Close() {
	if g.gitRepo != nil {
		g.gitRepo.Close()
	}
}

// CreateTopics creates topics
func (g *GiteaLocalUploader) CreateTopics(topics ...string) error {
	// ignore topics too long for the db
	c := 0
	for i := range topics {
		if len(topics[i]) <= 50 {
			topics[c] = topics[i]
			c++
		}
	}
	topics = topics[:c]
	return repo_model.SaveTopics(g.repo.ID, topics...)
}

// CreateMilestones creates milestones
func (g *GiteaLocalUploader) CreateMilestones(milestones ...*base.Milestone) error {
	mss := make([]*issues_model.Milestone, 0, len(milestones))
	for _, milestone := range milestones {
		var deadline timeutil.TimeStamp
		if milestone.Deadline != nil {
			deadline = timeutil.TimeStamp(milestone.Deadline.Unix())
		}
		if deadline == 0 {
			deadline = timeutil.TimeStamp(time.Date(9999, 1, 1, 0, 0, 0, 0, setting.DefaultUILocation).Unix())
		}

		if milestone.Created.IsZero() {
			if milestone.Updated != nil {
				milestone.Created = *milestone.Updated
			} else if milestone.Deadline != nil {
				milestone.Created = *milestone.Deadline
			} else {
				milestone.Created = time.Now()
			}
		}
		if milestone.Updated == nil || milestone.Updated.IsZero() {
			milestone.Updated = &milestone.Created
		}

		ms := issues_model.Milestone{
			RepoID:       g.repo.ID,
			Name:         milestone.Title,
			Content:      milestone.Description,
			IsClosed:     milestone.State == "closed",
			CreatedUnix:  timeutil.TimeStamp(milestone.Created.Unix()),
			UpdatedUnix:  timeutil.TimeStamp(milestone.Updated.Unix()),
			DeadlineUnix: deadline,
		}
		if ms.IsClosed && milestone.Closed != nil {
			ms.ClosedDateUnix = timeutil.TimeStamp(milestone.Closed.Unix())
		}
		mss = append(mss, &ms)
	}

	err := models.InsertMilestones(mss...)
	if err != nil {
		return err
	}

	for _, ms := range mss {
		g.milestones[ms.Name] = ms.ID
	}
	return nil
}

// CreateLabels creates labels
func (g *GiteaLocalUploader) CreateLabels(labels ...*base.Label) error {
	lbs := make([]*issues_model.Label, 0, len(labels))
	for _, label := range labels {
		lbs = append(lbs, &issues_model.Label{
			RepoID:      g.repo.ID,
			Name:        label.Name,
			Description: label.Description,
			Color:       fmt.Sprintf("#%s", label.Color),
		})
	}

	err := issues_model.NewLabels(lbs...)
	if err != nil {
		return err
	}
	for _, lb := range lbs {
		g.labels[lb.Name] = lb
	}
	return nil
}

// CreateReleases creates releases
func (g *GiteaLocalUploader) CreateReleases(releases ...*base.Release) error {
	rels := make([]*repo_model.Release, 0, len(releases))
	for _, release := range releases {
		if release.Created.IsZero() {
			if !release.Published.IsZero() {
				release.Created = release.Published
			} else {
				release.Created = time.Now()
			}
		}

		rel := repo_model.Release{
			RepoID:       g.repo.ID,
			TagName:      release.TagName,
			LowerTagName: strings.ToLower(release.TagName),
			Target:       release.TargetCommitish,
			Title:        release.Name,
			Note:         release.Body,
			IsDraft:      release.Draft,
			IsPrerelease: release.Prerelease,
			IsTag:        false,
			CreatedUnix:  timeutil.TimeStamp(release.Created.Unix()),
		}

		if err := g.remapUser(release, &rel); err != nil {
			return err
		}

		// calc NumCommits if possible
		if rel.TagName != "" {
			commit, err := g.gitRepo.GetTagCommit(rel.TagName)
			if !git.IsErrNotExist(err) {
				if err != nil {
					return fmt.Errorf("GetTagCommit[%v]: %v", rel.TagName, err)
				}
				rel.Sha1 = commit.ID.String()
				rel.NumCommits, err = commit.CommitsCount()
				if err != nil {
					return fmt.Errorf("CommitsCount: %v", err)
				}
			}
		}

		for _, asset := range release.Assets {
			if asset.Created.IsZero() {
				if !asset.Updated.IsZero() {
					asset.Created = asset.Updated
				} else {
					asset.Created = release.Created
				}
			}
			attach := repo_model.Attachment{
				UUID:          gouuid.New().String(),
				Name:          asset.Name,
				DownloadCount: int64(*asset.DownloadCount),
				Size:          int64(*asset.Size),
				CreatedUnix:   timeutil.TimeStamp(asset.Created.Unix()),
			}

			// download attachment
			err := func() error {
				// asset.DownloadURL may be a local file
				var rc io.ReadCloser
				var err error
				if asset.DownloadFunc != nil {
					rc, err = asset.DownloadFunc()
					if err != nil {
						return err
					}
				} else if asset.DownloadURL != nil {
					rc, err = uri.Open(*asset.DownloadURL)
					if err != nil {
						return err
					}
				}
				if rc == nil {
					return nil
				}
				_, err = storage.Attachments.Save(attach.RelativePath(), rc, int64(*asset.Size))
				rc.Close()
				return err
			}()
			if err != nil {
				return err
			}

			rel.Attachments = append(rel.Attachments, &attach)
		}

		rels = append(rels, &rel)
	}

	return models.InsertReleases(rels...)
}

// SyncTags syncs releases with tags in the database
func (g *GiteaLocalUploader) SyncTags() error {
	return repo_module.SyncReleasesWithTags(g.repo, g.gitRepo)
}

// CreateIssues creates issues
func (g *GiteaLocalUploader) CreateIssues(issues ...*base.Issue) error {
	iss := make([]*issues_model.Issue, 0, len(issues))
	for _, issue := range issues {
		var labels []*issues_model.Label
		for _, label := range issue.Labels {
			lb, ok := g.labels[label.Name]
			if ok {
				labels = append(labels, lb)
			}
		}

		milestoneID := g.milestones[issue.Milestone]

		if issue.Created.IsZero() {
			if issue.Closed != nil {
				issue.Created = *issue.Closed
			} else {
				issue.Created = time.Now()
			}
		}
		if issue.Updated.IsZero() {
			if issue.Closed != nil {
				issue.Updated = *issue.Closed
			} else {
				issue.Updated = time.Now()
			}
		}

		is := issues_model.Issue{
			RepoID:      g.repo.ID,
			Repo:        g.repo,
			Index:       issue.Number,
			Title:       issue.Title,
			Content:     issue.Content,
			Ref:         issue.Ref,
			IsClosed:    issue.State == "closed",
			IsLocked:    issue.IsLocked,
			MilestoneID: milestoneID,
			Labels:      labels,
			CreatedUnix: timeutil.TimeStamp(issue.Created.Unix()),
			UpdatedUnix: timeutil.TimeStamp(issue.Updated.Unix()),
			ForeignReference: &foreignreference.ForeignReference{
				LocalIndex:   issue.GetLocalIndex(),
				ForeignIndex: strconv.FormatInt(issue.GetForeignIndex(), 10),
				RepoID:       g.repo.ID,
				Type:         foreignreference.TypeIssue,
			},
		}

		if err := g.remapUser(issue, &is); err != nil {
			return err
		}

		if issue.Closed != nil {
			is.ClosedUnix = timeutil.TimeStamp(issue.Closed.Unix())
		}
		// add reactions
		for _, reaction := range issue.Reactions {
			res := issues_model.Reaction{
				Type:        reaction.Content,
				CreatedUnix: timeutil.TimeStampNow(),
			}
			if err := g.remapUser(reaction, &res); err != nil {
				return err
			}
			is.Reactions = append(is.Reactions, &res)
		}
		iss = append(iss, &is)
	}

	if len(iss) > 0 {
		if err := models.InsertIssues(iss...); err != nil {
			return err
		}

		for _, is := range iss {
			g.issues[is.Index] = is
		}
	}

	return nil
}

// CreateComments creates comments of issues
func (g *GiteaLocalUploader) CreateComments(comments ...*base.Comment) error {
	cms := make([]*issues_model.Comment, 0, len(comments))
	for _, comment := range comments {
		var issue *issues_model.Issue
		issue, ok := g.issues[comment.IssueIndex]
		if !ok {
			return fmt.Errorf("comment references non existent IssueIndex %d", comment.IssueIndex)
		}

		if comment.Created.IsZero() {
			comment.Created = time.Unix(int64(issue.CreatedUnix), 0)
		}
		if comment.Updated.IsZero() {
			comment.Updated = comment.Created
		}

		cm := issues_model.Comment{
			IssueID:     issue.ID,
			Type:        issues_model.CommentTypeComment,
			Content:     comment.Content,
			CreatedUnix: timeutil.TimeStamp(comment.Created.Unix()),
			UpdatedUnix: timeutil.TimeStamp(comment.Updated.Unix()),
		}

		if err := g.remapUser(comment, &cm); err != nil {
			return err
		}

		// add reactions
		for _, reaction := range comment.Reactions {
			res := issues_model.Reaction{
				Type:        reaction.Content,
				CreatedUnix: timeutil.TimeStampNow(),
			}
			if err := g.remapUser(reaction, &res); err != nil {
				return err
			}
			cm.Reactions = append(cm.Reactions, &res)
		}

		cms = append(cms, &cm)
	}

	if len(cms) == 0 {
		return nil
	}
	return models.InsertIssueComments(cms)
}

// CreatePullRequests creates pull requests
func (g *GiteaLocalUploader) CreatePullRequests(prs ...*base.PullRequest) error {
	gprs := make([]*issues_model.PullRequest, 0, len(prs))
	for _, pr := range prs {
		gpr, err := g.newPullRequest(pr)
		if err != nil {
			return err
		}

		if err := g.remapUser(pr, gpr.Issue); err != nil {
			return err
		}

		gprs = append(gprs, gpr)
	}
	if err := models.InsertPullRequests(gprs...); err != nil {
		return err
	}
	for _, pr := range gprs {
		g.issues[pr.Issue.Index] = pr.Issue
		pull.AddToTaskQueue(pr)
	}
	return nil
}

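// updateGitForPullRequest downloads the patch file of a pull request, writes its head
// reference into the local git repository and, for open fork pull requests, fetches the
// head branch from the remote. It returns the head branch name to store on the pull request.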
func (g *GiteaLocalUploader) updateGitForPullRequest(pr *base.PullRequest) (head string, err error) {
	// download patch file
	err = func() error {
		if pr.PatchURL == "" {
			return nil
		}
		// pr.PatchURL may be a local file
		ret, err := uri.Open(pr.PatchURL)
		if err != nil {
			return err
		}
		defer ret.Close()
		pullDir := filepath.Join(g.repo.RepoPath(), "pulls")
		if err = os.MkdirAll(pullDir, os.ModePerm); err != nil {
			return err
		}
		f, err := os.Create(filepath.Join(pullDir, fmt.Sprintf("%d.patch", pr.Number)))
		if err != nil {
			return err
		}
		defer f.Close()
		_, err = io.Copy(f, ret)
		return err
	}()
	if err != nil {
		return "", err
	}

	// set head information
	pullHead := filepath.Join(g.repo.RepoPath(), "refs", "pull", fmt.Sprintf("%d", pr.Number))
	if err := os.MkdirAll(pullHead, os.ModePerm); err != nil {
		return "", err
	}
	p, err := os.Create(filepath.Join(pullHead, "head"))
	if err != nil {
		return "", err
	}
	_, err = p.WriteString(pr.Head.SHA)
	p.Close()
	if err != nil {
		return "", err
	}

	head = "unknown repository"
	if pr.IsForkPullRequest() && pr.State != "closed" {
		if pr.Head.OwnerName != "" {
			remote := pr.Head.OwnerName
			_, ok := g.prHeadCache[remote]
			if !ok {
				// git remote add
				err := g.gitRepo.AddRemote(remote, pr.Head.CloneURL, true)
				if err != nil {
					log.Error("AddRemote failed: %s", err)
				} else {
					g.prHeadCache[remote] = struct{}{}
					ok = true
				}
			}

			if ok {
				_, _, err = git.NewCommand(g.ctx, "fetch", "--no-tags", "--", remote, pr.Head.Ref).RunStdString(&git.RunOpts{Dir: g.repo.RepoPath()})
				if err != nil {
					log.Error("Fetch branch from %s failed: %v", pr.Head.CloneURL, err)
				} else {
					headBranch := filepath.Join(g.repo.RepoPath(), "refs", "heads", pr.Head.OwnerName, pr.Head.Ref)
					if err := os.MkdirAll(filepath.Dir(headBranch), os.ModePerm); err != nil {
						return "", err
					}
					b, err := os.Create(headBranch)
					if err != nil {
						return "", err
					}
					_, err = b.WriteString(pr.Head.SHA)
					b.Close()
					if err != nil {
						return "", err
					}
					head = pr.Head.OwnerName + "/" + pr.Head.Ref
				}
			}
		}
	} else {
		head = pr.Head.Ref
		// Ensure the closed PR SHA still points to an existing ref
		_, _, err = git.NewCommand(g.ctx, "rev-list", "--quiet", "-1", pr.Head.SHA).RunStdString(&git.RunOpts{Dir: g.repo.RepoPath()})
		if err != nil {
			if pr.Head.SHA != "" {
				// git update-ref removes bad references with a relative path
				log.Warn("Deprecated local head, removing : %v", pr.Head.SHA)
				err = g.gitRepo.RemoveReference(pr.GetGitRefName())
			} else {
				// The SHA is empty, remove the head file
				log.Warn("Empty reference, removing : %v", pullHead)
				err = os.Remove(filepath.Join(pullHead, "head"))
			}
			if err != nil {
				log.Error("Cannot remove local head ref, %v", err)
			}
		}
	}

	return head, nil
}

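// newPullRequest converts a base.PullRequest into an issues_model.PullRequest together with
// its issue, labels, milestone and reactions, remapping the original poster to a local or
// external user.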
func (g *GiteaLocalUploader) newPullRequest(pr *base.PullRequest) (*issues_model.PullRequest, error) {
	var labels []*issues_model.Label
	for _, label := range pr.Labels {
		lb, ok := g.labels[label.Name]
		if ok {
			labels = append(labels, lb)
		}
	}

	milestoneID := g.milestones[pr.Milestone]

	head, err := g.updateGitForPullRequest(pr)
	if err != nil {
		return nil, fmt.Errorf("updateGitForPullRequest: %w", err)
	}

	if pr.Created.IsZero() {
		if pr.Closed != nil {
			pr.Created = *pr.Closed
		} else if pr.MergedTime != nil {
			pr.Created = *pr.MergedTime
		} else {
			pr.Created = time.Now()
		}
	}
	if pr.Updated.IsZero() {
		pr.Updated = pr.Created
	}

	issue := issues_model.Issue{
		RepoID:      g.repo.ID,
		Repo:        g.repo,
		Title:       pr.Title,
		Index:       pr.Number,
		Content:     pr.Content,
		MilestoneID: milestoneID,
		IsPull:      true,
		IsClosed:    pr.State == "closed",
		IsLocked:    pr.IsLocked,
		Labels:      labels,
		CreatedUnix: timeutil.TimeStamp(pr.Created.Unix()),
		UpdatedUnix: timeutil.TimeStamp(pr.Updated.Unix()),
	}

	if err := g.remapUser(pr, &issue); err != nil {
		return nil, err
	}

	// add reactions
	for _, reaction := range pr.Reactions {
		res := issues_model.Reaction{
			Type:        reaction.Content,
			CreatedUnix: timeutil.TimeStampNow(),
		}
		if err := g.remapUser(reaction, &res); err != nil {
			return nil, err
		}
		issue.Reactions = append(issue.Reactions, &res)
	}

	pullRequest := issues_model.PullRequest{
		HeadRepoID: g.repo.ID,
		HeadBranch: head,
		BaseRepoID: g.repo.ID,
		BaseBranch: pr.Base.Ref,
		MergeBase:  pr.Base.SHA,
		Index:      pr.Number,
		HasMerged:  pr.Merged,

		Issue: &issue,
	}

	if pullRequest.Issue.IsClosed && pr.Closed != nil {
		pullRequest.Issue.ClosedUnix = timeutil.TimeStamp(pr.Closed.Unix())
	}
	if pullRequest.HasMerged && pr.MergedTime != nil {
		pullRequest.MergedUnix = timeutil.TimeStamp(pr.MergedTime.Unix())
		pullRequest.MergedCommitID = pr.MergeCommitSHA
		pullRequest.MergerID = g.doer.ID
	}

	// TODO: assignees

	return &pullRequest, nil
}

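// convertReviewState maps a migration review state to the corresponding
// issues_model.ReviewType, falling back to ReviewTypePending for unknown states.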
func convertReviewState(state string) issues_model.ReviewType {
	switch state {
	case base.ReviewStatePending:
		return issues_model.ReviewTypePending
	case base.ReviewStateApproved:
		return issues_model.ReviewTypeApprove
	case base.ReviewStateChangesRequested:
		return issues_model.ReviewTypeReject
	case base.ReviewStateCommented:
		return issues_model.ReviewTypeComment
	case base.ReviewStateRequestReview:
		return issues_model.ReviewTypeRequest
	default:
		return issues_model.ReviewTypePending
	}
}

// CreateReviews creates pull request reviews of currently migrated issues
func (g *GiteaLocalUploader) CreateReviews(reviews ...*base.Review) error {
	cms := make([]*issues_model.Review, 0, len(reviews))
	for _, review := range reviews {
		var issue *issues_model.Issue
		issue, ok := g.issues[review.IssueIndex]
		if !ok {
			return fmt.Errorf("review references non existent IssueIndex %d", review.IssueIndex)
		}
		if review.CreatedAt.IsZero() {
			review.CreatedAt = time.Unix(int64(issue.CreatedUnix), 0)
		}

		cm := issues_model.Review{
			Type:        convertReviewState(review.State),
			IssueID:     issue.ID,
			Content:     review.Content,
			Official:    review.Official,
			CreatedUnix: timeutil.TimeStamp(review.CreatedAt.Unix()),
			UpdatedUnix: timeutil.TimeStamp(review.CreatedAt.Unix()),
		}

		if err := g.remapUser(review, &cm); err != nil {
			return err
		}

		// get pr
		pr, ok := g.prCache[issue.ID]
		if !ok {
			var err error
			pr, err = issues_model.GetPullRequestByIssueIDWithNoAttributes(issue.ID)
			if err != nil {
				return err
			}
			g.prCache[issue.ID] = pr
		}

		for _, comment := range review.Comments {
			line := comment.Line
			if line != 0 {
				comment.Position = 1
			} else {
				_, _, line, _ = git.ParseDiffHunkString(comment.DiffHunk)
			}
			headCommitID, err := g.gitRepo.GetRefCommitID(pr.GetGitRefName())
			if err != nil {
				log.Warn("GetRefCommitID[%s]: %v, the review comment will be ignored", pr.GetGitRefName(), err)
				continue
			}

			var patch string
			reader, writer := io.Pipe()
			defer func() {
				_ = reader.Close()
				_ = writer.Close()
			}()
			go func(comment *base.ReviewComment) {
				if err := git.GetRepoRawDiffForFile(g.gitRepo, pr.MergeBase, headCommitID, git.RawDiffNormal, comment.TreePath, writer); err != nil {
					// We should ignore the error since the commit may be removed when force pushing to the pull request
					log.Warn("GetRepoRawDiffForFile failed when migrating [%s, %s, %s, %s]: %v", g.gitRepo.Path, pr.MergeBase, headCommitID, comment.TreePath, err)
				}
				_ = writer.Close()
			}(comment)

			patch, _ = git.CutDiffAroundLine(reader, int64((&issues_model.Comment{Line: int64(line + comment.Position - 1)}).UnsignedLine()), line < 0, setting.UI.CodeCommentLines)

			if comment.CreatedAt.IsZero() {
				comment.CreatedAt = review.CreatedAt
			}
			if comment.UpdatedAt.IsZero() {
				comment.UpdatedAt = comment.CreatedAt
			}

			c := issues_model.Comment{
				Type:        issues_model.CommentTypeCode,
				IssueID:     issue.ID,
				Content:     comment.Content,
				Line:        int64(line + comment.Position - 1),
				TreePath:    comment.TreePath,
				CommitSHA:   comment.CommitID,
				Patch:       patch,
				CreatedUnix: timeutil.TimeStamp(comment.CreatedAt.Unix()),
				UpdatedUnix: timeutil.TimeStamp(comment.UpdatedAt.Unix()),
			}

			if err := g.remapUser(review, &c); err != nil {
				return err
			}

			cm.Comments = append(cm.Comments, &c)
		}

		cms = append(cms, &cm)
	}

	return issues_model.InsertReviews(cms)
}

// Rollback when migrating failed, this will roll back all the changes.
func (g *GiteaLocalUploader) Rollback() error {
	if g.repo != nil && g.repo.ID > 0 {
		g.gitRepo.Close()
		if err := models.DeleteRepository(g.doer, g.repo.OwnerID, g.repo.ID); err != nil {
			return err
		}
	}
	return nil
}

// Finish when migrating succeeded, this will update the repository stats and status.
func (g *GiteaLocalUploader) Finish() error {
	if g.repo == nil || g.repo.ID <= 0 {
		return ErrRepoNotCreated
	}

	// update issue_index
	if err := issues_model.RecalculateIssueIndexForRepo(g.repo.ID); err != nil {
		return err
	}

	if err := models.UpdateRepoStats(g.ctx, g.repo.ID); err != nil {
		return err
	}

	g.repo.Status = repo_model.RepositoryReady
	return repo_model.UpdateRepositoryCols(g.ctx, g.repo, "status")
}

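// remapUser maps the author of a migrated item either to a local user (when the source is
// the same Gitea instance) or to a linked external user; when no matching user is found,
// the item is attributed to the doer and keeps the original external name and ID.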
func (g *GiteaLocalUploader) remapUser(source user_model.ExternalUserMigrated, target user_model.ExternalUserRemappable) error {
	var userid int64
	var err error
	if g.sameApp {
		userid, err = g.remapLocalUser(source, target)
	} else {
		userid, err = g.remapExternalUser(source, target)
	}

	if err != nil {
		return err
	}

	if userid > 0 {
		return target.RemapExternalUser("", 0, userid)
	}
	return target.RemapExternalUser(source.GetExternalName(), source.GetExternalID(), g.doer.ID)
}

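// remapLocalUser resolves the external ID against users of this instance, caching the result
// in userMap; it returns 0 when the user was deleted or renamed so the caller falls back to
// the doer.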
func (g *GiteaLocalUploader) remapLocalUser(source user_model.ExternalUserMigrated, target user_model.ExternalUserRemappable) (int64, error) {
	userid, ok := g.userMap[source.GetExternalID()]
	if !ok {
		name, err := user_model.GetUserNameByID(g.ctx, source.GetExternalID())
		if err != nil {
			return 0, err
		}
		// let's not reuse an ID when the user was deleted or has a different user name
		if name != source.GetExternalName() {
			userid = 0
		} else {
			userid = source.GetExternalID()
		}
		g.userMap[source.GetExternalID()] = userid
	}
	return userid, nil
}

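// remapExternalUser looks up a local user linked to the external user ID for the current
// git service type, caching the result in userMap.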
func (g *GiteaLocalUploader) remapExternalUser(source user_model.ExternalUserMigrated, target user_model.ExternalUserRemappable) (userid int64, err error) {
	userid, ok := g.userMap[source.GetExternalID()]
	if !ok {
		userid, err = user_model.GetUserIDByExternalUserID(g.gitServiceType.Name(), fmt.Sprintf("%d", source.GetExternalID()))
		if err != nil {
			log.Error("GetUserIDByExternalUserID: %v", err)
			return 0, err
		}
		g.userMap[source.GetExternalID()] = userid
	}
	return userid, nil
}