5 DevOps Tips to Help a Beginning Developer

We recently wrote about how four curly braces took down a major Skyscanner service for four hours. In the comments, readers noted that the position of “Senior YAML Developer” may soon stop being a joke.

Here are some tips from the GitHub blog that can help a novice developer avoid the mistakes made by the Skyscanner team.


DevOps is in great demand in tech today, from CI/CD (continuous integration and continuous deployment) to container management and server provisioning. You could even say it is a buzzword that everyone has heard by now. As a developer, you can be part of a DevOps team without provisioning servers or managing containers yourself: your focus stays on building great software.

Much of what developers, DevOps engineers, and IT teams do in today’s software development lifecycle revolves around tooling, testing, automation, and server orchestration.

That is especially true if the team is working on a large open source project, or if the whole team is one person. Here are five DevOps best practices for developers who want to work smarter and faster.

YAML makes frontend work easier

First introduced in 2001, YAML has become one of the go-to languages for declarative automation and is widely used in DevOps, as well as for all kinds of front-end configuration, automation, and similar tasks. YAML originally stood for Yet Another Markup Language (today it is expanded as “YAML Ain’t Markup Language”). YAML markup is easy to read: it relies far less on special characters such as curly braces, square brackets, and quotes ({}, [], ").

Why is this important? As you learn or improve your YAML skills, it becomes easier to store application configuration, such as settings, in a language that is easy to both write and read.
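For example, here is a small, made-up application config (the keys and values are purely illustrative); indentation and plain key: value pairs do the work that braces and quotes do in other formats:

```yaml
# Hypothetical settings for a web service: nested sections, a list, and scalars,
# expressed with indentation instead of braces or quotes.
app:
  name: demo-service
  port: 8080
  debug: false
  features:
    - signup
    - payments
  database:
    host: db.internal
    pool_size: 10
```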

YAML files are everywhere, from enterprise development workflows to open source projects. There are plenty of YAML files on GitHub, too (they power a product we really like, GitHub Actions, but more on that later).

Wherever you use YAML, whether in routine workflows or in the various tools built around it, it is well worth getting started with the language or honing the YAML skills you already have. Want to learn more? Try the Learn YAML in Y Minutes guide.

DevOps tools help you accelerate

First, let’s be clear: “DevOps tools” is a broad term that covers cloud platforms, server orchestration tools, code management, version control, and more. These are all technologies that make it easier to write, test, host, and release software, and that make the fear of unexpected outages a thing of the past. Here are three DevOps tools that will help you speed up your workflows and focus on building great software.

Git

You probably know that Git is a distributed version control system and source code management tool. For developers, it is both a foundation of daily work and a popular DevOps tool.

Why?

Git simplifies version control: it lets you collaborate, experiment in separate branches, and merge new features back into the main development branch.
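As a minimal sketch of that flow (the branch name below is just a placeholder):

```bash
# Create an experimental branch, commit work there, then merge it back.
git switch -c feature/signup-form   # create and switch to a new branch (Git 2.23+)
# ...edit files...
git add .
git commit -m "Add signup form"
git switch master                   # return to the main development branch
git merge feature/signup-form       # bring the feature in
```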

Find out more about how Git works. [We have replaced the original English-language article link with a link to a translated, in-depth book about Git. — Ed.]

Cloud Integrated Development Environments (IDEs)

I know that name is a mouthful (thanks, marketing). Let’s keep it simple and call them cloud IDEs. Either way, these platforms are worth learning right away.

Here’s why: cloud IDEs are fully hosted developer environments that let you write, run, and debug code, and spin up new, pre-configured environments quickly. Need proof? Earlier this year we launched our own cloud IDE, Codespaces, and started using it to build GitHub itself. Spinning up a new developer environment used to take up to 45 minutes; now it takes about 10 seconds.

Cloud IDEs make it quick and easy to spin up new, pre-configured environments, including disposable ones. On top of that, you no longer have to worry about how powerful your own machine is (hello to everyone who has dared to write code on a tablet).

Imagine your laptop suddenly gives up the ghost (it has happened to me a couple of times), and with it go your npm versions, your cloud provider tooling, and a pile of settings. With a cloud IDE, you spin up an environment in the cloud with all of that already configured, and it feels like magic.

Find out how cloud IDEs work.

Containers

Don’t want a cloud-based IDE? Use containers, either locally or in the cloud. Over the past 10 years, containers have exploded in popularity thanks to their role in microservices architectures, CI/CD, cloud application development, and more. At their core, containers are lightweight and efficient, which makes it easy to build, test, and deploy software with them.

Mastering the basics of containerization is very helpful, especially for testing your code in a lightweight environment that mirrors production. Containers also make it easy to update a library or try the next version of an application before it goes to production.
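As a hedged sketch, a docker-compose.yml like the one below (the service names and image tags are placeholders, not a recommendation) runs an app next to a pinned dependency, so trying the next version is just a matter of bumping a tag:

```yaml
# Hypothetical docker-compose.yml: the app is built from the local Dockerfile;
# the database image tag is pinned and easy to bump for a pre-production trial.
services:
  app:
    build: .
    ports:
      - "8080:8080"
    depends_on:
      - db
  db:
    image: postgres:16          # change the tag to test the next version first
    environment:
      POSTGRES_PASSWORD: example
```

Run `docker compose up` to start the stack locally, and tear it down with `docker compose down` when you are done.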

This is especially useful for shift-left testing, an important DevOps strategy. Catching bugs or issues before they reach production saves you a lot of headaches, and catching them while you are still writing the code is even better. Problems discovered late mean extra work, so the sooner you find them, the better.

Find out how containers work.

Automated Testing and Continuous Integration (CI) Keep You One Step Ahead

These two concepts come up in almost every conversation about DevOps. Automated testing is usually part of a CI practice, though it is not a strict requirement (it really should be, at least as part of the continuous deployment phase).

Many teams limit themselves to basic unit tests as part of their CI process, skipping automated UI tests, integration tests, security vulnerability scanning, and so on.

So how do you speed up workflows and reduce the load on the DevOps team?

1. Make sure the code works with the master branch.

2. Find security vulnerabilities and other problems.

See below for how to do this.

GitHub Actions for Test Automation

GitHub Actions can do a lot, from ordering pizza to triggering an alert; it all comes down to workflow automation. To set up tests with GitHub Actions, create your own action or use a ready-made one from the GitHub Marketplace.

Find out how to automate workflows with GitHub Actions.

Pro tip: a great way to check for security vulnerabilities and other issues in your code before merging to master is to use GitHub Actions workflows that are triggered by pull requests to the project repository. The result: the main branch stays clean, and you stay one step ahead.
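A minimal sketch of such a workflow, assuming a Node.js project and GitHub’s CodeQL action for vulnerability scanning (swap in whatever language and test commands your project actually uses):

```yaml
# Hypothetical .github/workflows/pr-checks.yml: runs on every pull request.
name: PR checks
on:
  pull_request:
    branches: [master]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm ci
      - run: npm test          # unit tests gate the merge

  codeql:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      security-events: write   # required to upload scan results
    steps:
      - uses: actions/checkout@v4
      - uses: github/codeql-action/init@v3
        with:
          languages: javascript
      - uses: github/codeql-action/analyze@v3
```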

Learn more about GitHub Actions in this guide.

You can also configure workflows to deploy to temporary test environments: run the tests, deploy the changes, and try the application right there. You can even have the workflow automatically remove these test environments when it finishes. All of this lets you test as much as possible before anything reaches production.
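Sketching that idea (the deploy and teardown scripts below are hypothetical placeholders for whatever your project actually uses), a workflow can react to a pull request being opened or closed:

```yaml
# Hypothetical preview-environment workflow: deploys while a PR is open,
# removes the environment when the PR is closed.
name: Preview environment
on:
  pull_request:
    types: [opened, synchronize, closed]

jobs:
  deploy:
    if: github.event.action != 'closed'
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Deploy to a temporary environment (placeholder script)
        run: ./scripts/deploy-preview.sh "pr-${{ github.event.number }}"

  cleanup:
    if: github.event.action == 'closed'
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Tear down the temporary environment (placeholder script)
        run: ./scripts/remove-preview.sh "pr-${{ github.event.number }}"
```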

Using GitHub Actions to Create CI Pipelines

CI, or continuous integration, is the process of automatically integrating code from a project’s many contributors. A good CI practice lets you work faster, make sure the code compiles correctly, merge code changes more efficiently, and confirm that your code works well with everyone else’s.

The most powerful CI workflows are the ones that validate everything you need each time code is pushed. Working on GitHub? GitHub Actions handles this as well. There are many ready-made CI workflows on the GitHub Marketplace (and you can always create your own), but when introducing CI into your development process, keep a few nuances in mind (a sketch of such a workflow follows the list):

  • Run the tests you actually need. Think about what build, integration, and test automation your project requires. Look at what went wrong in past releases and whether a CI test could have caught it.

  • Balance test time against how often you push code. If you push new code every five minutes (hypothetically) but the tests take 10 minutes to run, that is a problem. It is always best to weigh what and when you test against how long it takes, and trim the list of tests down to a realistic number, at least for CI builds.
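Here is the sketch mentioned above: a hypothetical CI workflow that builds and tests a Node.js project on every push and pull request, with a deliberately small test matrix so runs stay fast (the versions and commands are assumptions, not a prescription):

```yaml
# Hypothetical .github/workflows/ci.yml: a small, fast matrix build.
name: CI
on:
  push:
    branches: [master]
  pull_request:

jobs:
  build-and-test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        node-version: [18, 20]    # keep the matrix small to keep CI fast
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: ${{ matrix.node-version }}
      - run: npm ci
      - run: npm run build --if-present
      - run: npm test
```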

Check out a tutorial on how to create a CI pipeline with GitHub Actions.

Server orchestration for increased flexibility and speed

If you are building a cloud application, or even just using a mix of servers, virtual machines, containers, or hosting services, you are probably dealing with several environments. Making sure the application and the infrastructure fit together reduces the last-minute scramble of trying to get your software running on that infrastructure.

This is where server orchestration comes in handy. Server or infrastructure orchestration is typically the responsibility of IT and DevOps teams: it covers configuring, managing, provisioning, and coordinating the systems, applications, and underlying infrastructure that run your software.

Pro tip: there is a whole set of tools that let you define the infrastructure you need and keep it up to date.
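One of those tools is Ansible; a minimal, hypothetical playbook that describes the desired state of a group of web servers might look like this (Terraform, Pulumi, and others fill a similar role with their own syntax):

```yaml
# Hypothetical playbook.yml: declares the desired state instead of
# configuring each server by hand.
- name: Configure web servers
  hosts: webservers
  become: true
  tasks:
    - name: Install nginx
      ansible.builtin.apt:
        name: nginx
        state: present

    - name: Ensure nginx is running and enabled
      ansible.builtin.service:
        name: nginx
        state: started
        enabled: true
```

Apply it with `ansible-playbook -i inventory.ini playbook.yml`; rerunning it later only changes whatever has drifted from the declared state.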

A big benefit of infrastructure automation is improved scalability. Once environments are defined, it is much easier to tear them down and recreate them when something goes wrong, instead of rebuilding everything from scratch by hand, as used to be the norm.

Another big advantage: if you need to test something, you don’t have to ask another team to set up a server. You do it yourself as part of a workflow, with no manual tweaking of hardware or system requirements.

How to get started: don’t try to replace everything in your environment with infrastructure automation at once. Find a part that is easy to automate and start there, then move on to the next, and the next.

And never experiment in production first. Start with a test environment; if it works there, move on to staging (and if it works in staging, it should work on the production server).

Try scripting repetitive tasks in Bash or PowerShell

Imagine you have a bunch of repetitive tasks that you run locally and that eat up too much time every week. There is a better, more efficient way to handle them: script them with Bash or PowerShell.

  • Bash has deep roots in the Unix world. It is a foundation for IT teams, DevOps teams, and many developers.

  • PowerShell is younger. Developed by Microsoft and launched in 2006, PowerShell replaced the command shell and earlier scripting languages for task automation and configuration management in Windows environments.

Today, both Bash and PowerShell are cross-platform (although most Windows users stick with PowerShell, and most people on Linux or macOS use Bash).

Pro tip: Bash and PowerShell work differently: where PowerShell deals in objects, Bash passes information around as strings. Which one to choose largely comes down to personal preference.

For example, I have used both Bash and PowerShell to write a script that pulls the latest version of the code, creates a new branch, switches to it, opens a draft pull request on GitHub, and then launches VS Code (substitute your editor of choice) in that branch.
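A rough Bash sketch of that script, assuming Git, the GitHub CLI (gh), and the VS Code `code` command are installed (the branch name is a placeholder):

```bash
#!/usr/bin/env bash
# Sketch: update master, branch off, push, open a draft PR, launch the editor.
set -euo pipefail

branch="${1:-feature/my-change}"   # pass a branch name, or fall back to a placeholder

git switch master
git pull origin master                        # take the latest version of the code
git switch -c "$branch"                       # create the new branch and switch to it
git commit --allow-empty -m "Start $branch"   # seed the branch so a PR can be opened
git push -u origin "$branch"
gh pr create --draft --title "$branch" --body "Work in progress"
code .                                        # open the working copy in VS Code
```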

Each of these is a small step, but scripting them together makes life much easier. I used to run through this by hand once or twice a week; after writing the script, I have more time for what matters: writing great code.

Takeaways

There is a big difference between an IT professional, a DevOps engineer, and a developer. But in today’s software development world, many of the basic DevOps practices are becoming available to everyone.

Plus, any developer who masters a few DevOps tricks will find it easier to work independently (and more efficiently) while continuing to focus on what matters most, which is creating great software. And each of us can do it.

