The Continuous Integration system will be responsible for the automation of the following tasks:
Building the Kaltura platform packages against the master branch, release branches and approved pull requests (for both nightly and stable releases).
Pushing the packages to the install repositories.
Performing a full Kaltura deployment on a test cluster.
Running automated tests of the installed server features via API calls and command-line scripts, determining overall build stability for both clean installs and version-to-version upgrades.
Generating web-page build reports and sending email notifications on failures.
Distributing packaged/compiled API client libraries to their respective language repositories.
Why do we need a CI system?
An automated approach to the build, test and release process has many advantages; most prominently, it significantly reduces the time it takes to release new packages and ensures packages are fully tested before being used in production.
Some key advantages for our specific project:
Release more often, faster, and provide nightly builds for advanced platform testers.
Ensure new commits do not break existing functionality on the master branch.
Allow contributors to make changes and additions with a higher level of confidence, knowing pull requests are tested against the whole system in production mode before being merged.
Provide elaborate platform test reports before entering the official manual QA phase.
Bootstrapping
The CI makes use of answer files to achieve an unattended deployment.
Edit the template to reflect your settings and place the resulting file under /etc/kalt.ans on all machines you plan to use for running the CI. Bear in mind that this file contains sensitive information and should therefore be readable and writable only by super users.
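For example, a minimal sketch of locking the file down (assuming root ownership fits your setup):

    # The answer file holds passwords and other sensitive settings,
    # so make it readable and writable by root only.
    chown root:root /etc/kalt.ans
    chmod 600 /etc/kalt.ans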
Create $BASE_DIR/rpm_cluster_members and $BASE_DIR/deb_cluster_members. These should contain a list of hosts, separated by newlines, like so:
my.machine0
my.machine1
my.machine2
Use $BASE_DIR/csi.sql to create an SQLite3 DB under $BASE_DIR/db/csi.db.
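For instance, assuming the sqlite3 command-line client is installed:

    # Create the CI results DB from the provided schema.
    mkdir -p $BASE_DIR/db
    sqlite3 $BASE_DIR/db/csi.db < $BASE_DIR/csi.sql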
Edit $BASE_DIR/run_sanity.sh and set the needed params.
Generating and testing API clients
If you wish to generate and test the API clients as part of the CI process:
In $BASE_DIR/run_sanity.sh, set TEST_CLIENTS to 'Y' and set CSI_CLIENT_GENERATING_HOST to the host you wish to generate the clients from.
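A minimal sketch of what these settings might look like in $BASE_DIR/run_sanity.sh (the host name is only an example):

    # Generate and test the API client libraries as part of the CI run.
    TEST_CLIENTS=Y
    # Host from which the client libraries will be generated.
    CSI_CLIENT_GENERATING_HOST=my.machine0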
Edit clientlib_to_git_repo.rc so that it lists the GitHub repos you set up for hosting the generated clients.
NOTE: the script expects these repos to be hosted on GitHub; at Kaltura, we use Travis CI for the actual testing.
That said, if you intend to use a different source control system [or just Git but not GitHub] or a different CI service, this can be achieved by making minor changes to $BASE_DIR/clientlibs_test.sh.
Web interface
A simple web interface for showing the CI results is included in this repo [see index.php].
This can be placed on the docroot of any web server that supports PHP 5.3 and above.
Note that index.php requires the PHP SQLite3 extension to be enabled.
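A quick way to check whether the extension is available (the php.ini line is an assumption; the exact file location depends on your distribution):

    # Check that the SQLite3 extension is loaded.
    php -m | grep -i sqlite3
    # If it is missing, enable it in php.ini, e.g.:
    # extension=sqlite3.so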
The Test Suites
All API calls and apps will be loaded over SSL.
The following test cases will be run, in the order listed, on each cluster deployment (both clean and upgrade); a minimal sketch of a few of these checks appears after the list.
Nightly testing should run a complete regression coverage via API client libs, verifying the stability of the latest MASTER branch.
Check space on / partition
Check space on web partition
Check the KDP3 version is correct by comparing KMC's config.ini with the actual last created KDP3 dir on disk.
Check the KMC version is correct by comparing KMC's config.ini with the actual last created KMC dir on disk.
Check the HTML5 player version is correct by comparing KMC's config.ini with the actual last created HTML5 dir on disk.
Verify that all relevant processes (Apache, MySQL, Sphinx, batch, memcache, monit) are up and running on all machines in the cluster
Verify that all processes and crons are properly configured to run after system restart
Verify HTTPS redirects for the start page, KMC, Admin Console and testme. Perform a curl request (following redirects) to each of the URLs and test that the response returned is as expected:
https://[DOMAIN]/apps/studio --- Verify Universal Studio URL
Verify system restart behaviour (run 1 through 3 post restart)
Verify that processes (Apache, MySQL, Sphinx, batch, memcache) are being relaunched by monit after MANUAL kill (testing crash resurrection).
Verify new publisher account creation. Continue all following tests on this new partner account.
Verify profile ID creation.
Verify profile ID delete.
Check the email logs to verify that the new publisher account activation email was sent.
uiConf and file verifications:
Run through all the uiConfs in the database.
For each uiConf, go over the uiConf object URLs and all file paths referenced inside the uiConf XML (swf, js, image files, etc.) and verify that these files exist on disk.
Check that the kmc.swf and login.swf requests return 200.
Verify YouTube distribution (create profile, distribute entry, query for success)
Check the bandwidth report API to see bandwidth and storage counts
Verify KS Access Control:
Create an AC Profile with KS protection
Assign it to a Video Entry
Curl playManifest for that entry without a KS and verify that the video is not returned.
Create a Local XML DropFolder, copy a file to the folder, and verify the file was successfully pulled into Kaltura and transcoded and that the XML metadata exists.
Create a Remote Storage profile against an S3 bucket and verify that uploaded content gets pushed to the S3 bucket.
Verify Player - Use http://phantomjs.org/ to run base tests against player embed, playlist embed, thumbnail embed, and common player scenarios (play, pause, seek)
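As a rough illustration, here is a minimal bash sketch of a few of the checks above. DOMAIN, PARTNER_ID, ENTRY_ID, the 90% threshold and the playManifest URL shape are assumptions made for the example; the real checks live in this repo's test scripts:

    #!/bin/bash
    # Hypothetical values; the real ones come from the answer file.
    DOMAIN=my.machine0
    PARTNER_ID=101
    ENTRY_ID=0_xxxxxxxx

    # Check space on the / partition (fail above an assumed 90% usage).
    USAGE=$(df -P / | awk 'NR==2 {gsub(/%/,"",$5); print $5}')
    [ "$USAGE" -lt 90 ] || echo "FAILED: / partition is ${USAGE}% full"

    # Verify an HTTPS redirect lands on a 200 (following redirects).
    RC=$(curl -skL -o /dev/null -w '%{http_code}' "https://$DOMAIN/apps/studio")
    [ "$RC" = 200 ] || echo "FAILED: /apps/studio returned $RC"

    # Verify a KS-protected entry does NOT return media without a KS.
    # (Simplified: a real check would also inspect the response body.)
    RC=$(curl -sk -o /dev/null -w '%{http_code}' \
      "https://$DOMAIN/p/$PARTNER_ID/playManifest/entryId/$ENTRY_ID/format/url")
    [ "$RC" = 200 ] && echo "FAILED: protected entry was served without a KS"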
The Reports
Execution Time Benchmarks
For each step in the CI cycle, execution time measurements will be performed and saved in order to analyze platform trends over time. The following CI steps will be measured (a timing sketch follows the list):
Time it took to pull the code from git repositories.
Time it took to build packages.
Time it took to push packages to install repositories.
Time it took to install each package on the test clusters (clean and upgrade).
Time it took to run post-inst scripts per package.
Time it took to run each unit test.
Aggregate time from pulling the code to finishing the tests (the complete cycle).
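A minimal sketch of how a single step's duration might be captured and stored (the benchmarks table and its columns are an assumption; the actual schema is whatever $BASE_DIR/csi.sql defines):

    # Time a CI step and record the result in the CI DB.
    START=$(date +%s)
    ./build_packages.sh   # hypothetical step
    END=$(date +%s)
    sqlite3 $BASE_DIR/db/csi.db \
      "INSERT INTO benchmarks (step, seconds) VALUES ('build_packages', $((END - START)));"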
Web Reports
A full test report will be available at a URL containing the version-date combo.
The header should show the overall health status of the build: the percentage of failed/passed tests.
Per test: the status (FAILED or PASSED) and, if failed, the unit test error output, as in the following table:
Test File | Status | Execution Time | Details
----------|--------|----------------|-----------------------------------
test_xxx  | PASSED | 12ms           |
test_yyy  | PASSED | 100ms          |
test_zzz  | FAILED | 876912ms       | Output of what failed in the test
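A minimal sketch of how the web report could pull these rows out of the CI DB (the test_results table and its column names are assumptions; see $BASE_DIR/csi.sql for the real schema):

    # Emit the per-test results table as HTML straight from the DB.
    sqlite3 -html $BASE_DIR/db/csi.db \
      "SELECT test_file, status, execution_time, details FROM test_results;"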
Email Reports
Set up a mailing list for people to subscribe to for receiving reports via email. 3 types of emails:
All code in this project is released under the AGPLv3 license unless a different license for a particular library is specified in the applicable library path.