Contributing
Please submit bug reports, feature requests, or general feedback to our Bug Tracker on GitLab, or to our Support Email.
Since most contributions to the project are code, the rest of this document explains the code contribution process.
Guide for creating issues
GitLab issues are the first step in continuously improving and correcting our products. That is why we have developed this comprehensive guide to provide the necessary steps to create an issue effectively.
Below, we will explain everything you need to know when creating an issue:
Who can create an issue?
Since our code is open source, anyone with a GitLab account can create an issue.
Issue title
The title of an issue is essential for communicating clearly and concisely what you want to accomplish. It should be descriptive enough for readers to understand it quickly. Issue titles follow this syntax:
[Product] Brief Description of the Problem
For example:
[Skims] Check AWS ELB listener on http
[Airs] Update the images of the hacking cycle on the homepage
Type
Issue: To add new features to the product.
Description
Here, you can select a template that best describes the issue. The templates are as follows:
- Bug: Refers to a bug or error affecting a product’s functionality or performance.
- Feature: Add new functionality to one of our products.
- Onboarding Steps: Create a guide for new developers, with internal and external links that help them find accurate information.
- Request for comments: Propose and discuss new standards, protocols, or features, inviting comments, suggestions, and observations from the community before finalizing the implementation.
- Skims method: Create a new Skims method.
- UX prototype: Generate a proposal from a UX prototype.
Assignees
Here you assign the person responsible for working on the issue.
Labels
The purpose of these labels is to categorize and organize the issues, providing a quick and visual way to identify the content and status of the elements in the project. Labels allow better management, tracking, and filtering of these elements, facilitating collaboration and decision-making within the development team. To know which labels to use, see the following link with their definitions.
Steps
Code contributions are made using the Merge Request feature on GitLab.
As the Author of the contribution, please read the following steps and apply them in your day-to-day work:
1. Configure your Development Environment.
2. Make sure that your contribution has an associated issue in the bug tracker, or create one.
3. Make sure that you understand the motivation behind the issue, the problem it is trying to solve, its impact, its trade-offs, and that it makes sense to implement it as described. Issues are not set in stone; iterate on this as many times as needed, and edit the issue as you go.
   - Make sure you enumerate all of the components that will be impacted by the issue.
   - Make sure the issue has received sufficient feedback from the Code Owners of the impacted components before starting any implementation.
   - Don’t be afraid to ping the author or the Code Owners for clarification. Excellent developers do excellent requirement analysis.
4. Code, and:
   - For each of the issue’s impacted components and their corresponding component page:
     - Keep their docs updated.
     - Make sure you follow their guidelines.
     - Make sure you don’t violate their Public Oaths.
     - Keep their architecture updated.
     - Add any missing information to their documentation. We want to level up and empower other developers to write code autonomously and with confidence, but we cannot do so without documentation. Documentation is important; make yourself replaceable.
   - Make sure that your implementation is sufficiently tested:
     - By adding automated tests to the CI/CD.
     - By manually testing the functionality.
     Feel free to use feature flags if appropriate (see the sketch after this list).
   - Make sure that you update the End User documentation, particularly the Platform section.
5. Open a Merge Request, and feel free to ping, assign, or send a direct message with the link to the Code Owners of the issue’s impacted components.
6. Go back to step 3 until the issue is completed.
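As a reference for the testing step above, here is a minimal sketch of what an automated test guarded by a feature flag might look like with pytest. Every name in it (the flag store, the flag name, and the function under test) is an illustrative assumption, not actual product code.

```python
# Minimal sketch only: an automated test plus a feature-flag guard.
# FEATURE_FLAGS, the flag name, and generate_summary are hypothetical.
import os

import pytest

# Trivial flag store; real products may read flags from the environment or a service.
FEATURE_FLAGS = {
    "new_summary_format": os.environ.get("NEW_SUMMARY_FORMAT", "true") == "true",
}


def generate_summary(findings: list[str]) -> str:
    """Toy function under test: joins findings into a one-line summary."""
    return f"{len(findings)} findings: " + ", ".join(sorted(findings))


@pytest.mark.skipif(
    not FEATURE_FLAGS["new_summary_format"],
    reason="new summary format is disabled",
)
def test_generate_summary_counts_findings() -> None:
    summary = generate_summary(["open-port", "weak-cipher"])
    assert summary.startswith("2 findings:")
```

Running pytest on such a file exercises the test only while the flag is enabled, which lets you land code behind a flag and still keep it covered by the CI/CD.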
Review process
We conduct code reviews using the Merge Request feature on GitLab, and discussions should happen in the open, either on the Issue, the Merge Request, or the team-wide communication channel.
Reviewers are selected by the Head of Product. In general, a reviewer reads the issue and then reviews the files modified by the Author.
A reviewer must have the following mindset when performing a review:
- Transferring knowledge to the author.
  This can range from a small code suggestion on how to make the code more maintainable or faster, to suggesting a library, reminding them of the guidelines, suggesting a way to organize the code, or signaling fundamental architecture, bug, or security problems with the current approach the author is taking.
  There are 8 quality characteristics of good software. Help the author think about all of them.
- The author probably knows more than the reviewer.
  The author is the one in the field, touching the code, and seeing the problem first-hand. Always give the benefit of the doubt and start the discussion with a question, rather than an affirmation that things are wrong. There is a chance the reviewer is not seeing the full picture.
- Neither the reviewer nor the author has more authority.
  We are all Developers.
  When proposing something, make it sound like a proposal and not like an order. If what a reviewer says has value, the author will probably accept it and apply it right away. If a discussion arises, keep it healthy, constructive, and argument-based. Either the author is seeing something the reviewer doesn’t see yet, or the reviewer is seeing something the author doesn’t see yet. This “aha” moment unlocks learning, and a safe environment to argue is key to good decision-making.
- Minor improvements or fixes can come later.
  If merging a Merge Request adds more value than closing it, go ahead and merge it. Just take note somewhere so that the author remembers to amend it later. Also, don’t be too picky, especially about things that are subjective, like style or formatting, or too minor to even pay attention to (like a typo in a comment).
A reviewer must check:
- That the contributing steps have been followed, not only in the Merge Request, but also in the associated Issue.
- That the Merge Request adds more value than it takes away. This is subjective, but the 8 quality characteristics of good software are a good starting point.
A reviewer should accept a contribution if it’s been made according to this document.
Code Coverage
Codecov is an invaluable tool for evaluating the quality and effectiveness of unit tests in our products. Here’s a detailed guide on how to interpret and utilize the reports it provides:
- Accessing Reports: Codecov coverage reports are available for each Merge Request and can be accessed directly from the GitLab user interface.
- Coverage Interpretation: The coverage report provides an overview of the code being tested by our unit tests. It is represented as a percentage, where 100% means all lines of code are being executed by our tests.
- Identifying Untested Areas: Examine the coverage report to identify code areas not adequately tested. These areas may be critical points requiring further attention in terms of unit testing.
- Coverage Trends: Codecov also offers insights into coverage trends over time. Use this information to evaluate whether our product’s coverage is improving or declining.
- Corrective Actions: If areas of code with insufficient coverage are identified, collaborate with the team to implement additional unit tests and improve coverage in those areas.
By effectively using this tool, we can ensure the quality and stability of our code over time. This practice also fosters a test-focused development culture throughout our team.
Integration of Coverage Results in Codecov
In our development workflow, each time tests are run in our continuous integration (CI) environment, artifacts containing coverage results are accumulated. These results are generated by tools such as pytest, jest, and cypress, providing a detailed view of both unit and integration test coverage.
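As an illustration only, a Python test run could produce such an artifact with pytest and the pytest-cov plugin. The package name, test directory, and report path below are assumptions for the sketch; each product’s CI job defines its own.

```python
# Illustrative sketch: produce a coverage artifact with pytest + pytest-cov.
# Package, test directory, and output path are hypothetical examples.
import subprocess
import sys

result = subprocess.run(
    [
        "pytest",
        "--cov=integrates",               # package to measure (hypothetical)
        "--cov-report=xml:coverage.xml",  # artifact later uploaded to Codecov
        "tests/unit",                     # hypothetical test directory
    ],
    check=False,
)
sys.exit(result.returncode)  # propagate the test result to the CI job
```

Tools such as jest and cypress can generate analogous coverage reports for the front-end components, which are collected the same way.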
At the end of this process, a dedicated job uploads these coverage results to Codecov. Each of our products has such a job; for example, in the case of Integrates, the job is located at integrates/coverage.
This job not only uploads coverage results to Codecov but also verifies that the coverage percentage meets a minimum threshold. This threshold is defined in the job’s entrypoint.sh script. If the coverage percentage is below the threshold, the job fails and blocks the merge of changes, ensuring that test coverage does not decrease and always tends to improve.
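The actual check lives in each job’s entrypoint.sh, but the idea can be sketched in a few lines of Python. The threshold value and report location below are assumptions for illustration only.

```python
# Minimal sketch of the threshold idea implemented by each job's entrypoint.sh.
# The real script, threshold value, and report location differ per product.
import sys
import xml.etree.ElementTree as ET

MINIMUM_COVERAGE = 80.0  # hypothetical threshold, percent


def main() -> int:
    # Cobertura-style XML reports (as produced by pytest-cov) expose a
    # `line-rate` attribute on the root element, expressed as a fraction.
    root = ET.parse("coverage.xml").getroot()
    coverage = float(root.attrib["line-rate"]) * 100
    if coverage < MINIMUM_COVERAGE:
        print(f"Coverage {coverage:.1f}% is below the {MINIMUM_COVERAGE}% threshold")
        return 1  # non-zero exit fails the job and blocks the merge
    print(f"Coverage {coverage:.1f}% meets the threshold")
    return 0


if __name__ == "__main__":
    sys.exit(main())
```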
Additionally, Codecov provides information about coverage deltas, showing changes in code coverage between two consecutive test runs. This allows us to quickly identify which code areas have been affected by recent changes and ensure they are adequately tested.
You can access the coverage reports and corresponding graphs at this link.
This integration ensures that we maintain high standards of test coverage in all our products, contributing to the overall stability and quality of our code.