This repo provides an example of how to run the TeamCity CI Server on OpenShift. It spins up the TeamCity CI Server, shows an example of building an image and pushing it into the internal OpenShift image registry, and installs the TiDB Operator to serve as the database for the TeamCity server. This is a general example; a production deployment would require security hardening.
- See the Known Issues section below.
Contributions are welcome, especially around additional build use cases.
- This repo leverages OpenShift GitOps to perform application deployment. Install OpenShift GitOps first.
- The repo requires a dynamic default storage class.
- The repo requires elevated privileges to run.
- Git clone this repo.
- Change directory into the cloned repo.
- Create the infra application via Argo CD:
oc create -f ./deploy-tooling/argocd-deploy/application-infra.yaml
This should create the necessary namespaces, service account, and permissions to run the example.
- The ServiceAccount being used requires elevated privileges, and the GitOps tool may not have permission to grant them. Create the SCC for the teamcity-sa service account.
If using Docker-in-Docker, the privileged SCC is needed:
oc create -f ./deployables/Infra/base/teamcity-scc-privileged.yaml
Otherwise:
oc create -f ./deployables/Infra/base/teamcity-scc-anyuid.yaml
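The choice between the two SCC manifests above can be scripted; a minimal sketch (the `DIND` flag is a hypothetical local variable, not something the repo defines, and the final `oc create` is left as a comment since it needs a live cluster):

```shell
# Pick the SCC manifest based on whether Docker-in-Docker is needed.
# DIND is a hypothetical flag for this sketch, not part of the repo.
DIND=true
if [ "$DIND" = true ]; then
  SCC_FILE=./deployables/Infra/base/teamcity-scc-privileged.yaml
else
  SCC_FILE=./deployables/Infra/base/teamcity-scc-anyuid.yaml
fi
echo "$SCC_FILE"   # then: oc create -f "$SCC_FILE"
```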
- Create the TiDB application via Argo CD:
oc create -f ./deploy-tooling/argocd-deploy/application-TiDB.yaml
This should create the TiDB Operator and a TiDB (MySQL-compatible) cluster, create a user called teamcity and a database called teamcity, and set the root and teamcity user passwords to teamcity.
- Create the TeamCity server via Argo CD:
oc create -f ./deploy-tooling/argocd-deploy/application-teamcity.yaml
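The three `oc create` steps above can also be driven from one loop; a sketch that just echoes the commands rather than running them (no cluster access is assumed here):

```shell
# Echo (rather than run) the oc create command for each Argo CD application
for app in application-infra application-TiDB application-teamcity; do
  echo "oc create -f ./deploy-tooling/argocd-deploy/${app}.yaml"
done
```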
- Wait for the teamcity deployment to be ready:
oc rollout status deployment/teamcity-server-instance -n teamcity-server -w
- Get the TeamCity server URL:
oc get route/teamcity-server-instance -n teamcity-server -o jsonpath='{.spec.host}'
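The host returned by the route can be turned into a browsable URL; a sketch using a sample host value (in a real run, `HOST` would be captured from the `oc get route` command above, and the scheme depends on whether the route has TLS termination):

```shell
# Sample host value; in practice:
# HOST=$(oc get route/teamcity-server-instance -n teamcity-server -o jsonpath='{.spec.host}')
HOST="teamcity-server-instance-teamcity-server.apps.example.com"
# Use https:// instead if the route is TLS-terminated
echo "http://${HOST}"
```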
- Open the TeamCity URL in a browser. You should see an empty-database page like the one below; check the Known Issues section if you see something different.
Select the "I'm a server administrator..." link.
- The link brings up a prompt for a token. The token can be found in the pod log: locate the deployment instance and its running pod, then open the pod's logs. The token should be visible near the end of the logs. See the sample below.
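Rather than scanning the log by eye, the token can be extracted with sed; a sketch against a sample log line (the sample stands in for `oc logs` output from the server pod, and the exact wording TeamCity prints may differ between versions):

```shell
# Sample log line standing in for the server pod's log output
LOG='[TeamCity] Super user authentication token: 1234567890123456789 (use empty username with the token as the password to access the server)'
# Pull out just the numeric token
TOKEN=$(printf '%s\n' "$LOG" | sed -n 's/.*Super user authentication token: \([0-9]*\).*/\1/p')
echo "$TOKEN"
```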
- TeamCity should start initializing.
- TeamCity should show a License Agreement box. If the terms are acceptable, accept and continue.
- TeamCity should display a login page. Select the option to log in as Super User and provide the token copied from the logs above.
- After logging in you should land on the TeamCity home page. The server has a pre-configured project, build, and cloud profiles restored from the backup files. Unfortunately, credentials are not yet updated automatically, so we need to update them.
Select the Testflask project page, then click Edit Settings at the top right of the page.
You should arrive at the Settings page. Select the Cloud Profiles option.
In the Cloud Profiles list you should find the pre-configured cloud profile for "Openshift". Select the edit button for that cloud profile.
The edit cloud profile page should look like the screenshot below.
We need to replace the token. Get the teamcity-sa token:
TOKEN=$(oc exec $(oc get pods -n teamcity-server -l app=teamcity-server-instance -o name) -n teamcity-server -- oc whoami -t)
Copy the token value from above and paste it into the token field. Use Test Connection to make sure it works, then save.
Scroll down to Agent Images on the same page and select the edit option. Make sure the Deployment option is set to teamcity-agent-instance, which we created earlier, and click Save; this should spin up an agent. If it does not, go to the Agents link (top of page) and start one manually.
With the agent running, go back to the testFlask project page and select Run on the build.
This should start the run, and we can go to the build log to watch the running build. At the end of the build a new image should be pushed to the testFlask image stream. You can confirm by running:
oc get is/openshift-teamcity-testflask -n teamcity-server
It should show a recent update.
- The Argo CD health status for the TiDB Operator is incorrect, so the TiDB application shows as degraded. A fix is in progress.
- If the Docker agent shows a "docker.server.version exists" requirement failure, the teamcity-sa service account does not have the necessary permissions.
- If the build fails on "FROM image-registry.openshift-image-registry.svc:5000/openshift/ubi8" with a "name unknown" error, the ubi8 image has not been added to your cluster. Run:
oc tag --source=docker registry.redhat.io/ubi8/ubi:latest ubi8:latest -n openshift