© Joseph Coburn 2020
J. CoburnBuild Your Own Car Dashboard with a Raspberry Pihttps://doi.org/10.1007/978-1-4842-6080-7_4

4. Local Development Configuration

Chapter goal: Set up a local development environment on your computer to write Python and build the application. Configure a Git repository to store your code, and learn how to run both your code and unit tests locally.
Joseph Coburn, Alford, UK

Throughout this book, you’ve mainly learned theoretical computer science, case studies, and best practices, along with an introduction to the Pi and an overview of the software and hardware needed to follow along with these projects. You should now know the “why,” but not the “how.” This practical chapter covers the first steps needed to begin building the project.

A local development environment is the first and the most important tool in your software developer’s toolkit. This lets you write code, write unit tests, safely store and version your code, and rapidly make changes with instant feedback. By running the project on your computer, you can quickly iterate over new designs and features and experiment with code. You don’t even need a Raspberry Pi at this stage.

Note

Running code on your computer instead of the Pi itself provides one major benefit: speed! It’s far quicker and easier to run your code locally instead of remotely. With a build pipeline hanging off your Git repo, it could take several minutes to deploy code to the Pi. This doesn’t sound like a lot, but it gets tedious very quickly, especially if you change something simple like a comma or a bracket. Running code on your computer lets you instantly see the changes – it’s as fast as pressing save.

There is a time and a place for running and testing code on the Pi itself, as your computer is not a Pi and behaves very differently in terms of supported hardware, processing speed, and CPU instruction set. Later on, you’ll learn when and why to deploy to your Pi.

Throughout this book, the instructions given are written for the Unix-based macOS and the Raspberry Pi’s Debian-based Raspbian operating systems. Linux users should have no problem adapting the commands – in many cases, they are identical. Microsoft Windows users may struggle to complete some of the console-heavy instructions.

Despite Windows’ popularity with consumers, writing code on a Windows machine is more difficult than it should be, and the vast majority of Windows development machines are used for working with Microsoft technologies. If you’re attempting this project on a Windows computer, pay special attention to the instructions. Once you know what you need to do, you may then need to figure out how to achieve it in Windows. Often, a simple console command on Mac or Linux requires a GUI installer on Windows.

Getting Started with Python

Almost any current version of Python will work with these projects (although you may have to perform your own troubleshooting if certain features are not available in older releases). I don’t recommend you use Python version 2 releases, as discussed in Chapter 2. Many new Python modules simply won’t work with old versions of Python, and Python 2 reached its official end of life in January 2020 – meaning you shouldn’t begin new projects with it.

Other languages could work with this project. Perhaps PHP, Java, or Go would do the job, but at the very least, you’d need to know how to develop in those languages, and much of the guidance in the following chapters would be rendered useless by choosing a language other than Python. You’d also need to learn how to interface with the sensors, read the camera stream, and drive the various other components used in the projects. The official Raspberry Pi Apache web server tutorial (www.raspberrypi.org/documentation/remote-access/web-server/apache.md) may be a good starting point for one of these alternative languages.

Don’t forget that the vast majority of Pi projects are implemented in Python. A vast range of articles, papers, tweets, and tutorials exists around Python on the Pi – a resource pool that shrinks significantly when you work in other languages.

Python is quick to learn, easy to read, and uses standard English keywords. Its popularity is going from strength to strength right now, and the Pi is perfectly paired to work with Python as the go-to language.

In order to run any Python scripts, you need to install a Python interpreter. This reads your Python code and translates it into instructions your computer can understand. This happens in real time, and a Python interpreter is often simply called Python. When you run code with Python version 3.7 (www.python.org/downloads/release/python-370/), you are simply using the Python interpreter, version 3.7.

macOS comes with a Python interpreter preinstalled, but it’s not good practice to use it. This Python is required by some tools used by your operating system. It can change without warning and is usually an old version such as Python 2.7. There’s also the problem of dependencies. Libraries you import into your code are called modules – your code is dependent on these. If these modules are not part of the Python core library, you need to install them. If two projects both use your system Python, it’s common for their module requirements to compete with each other. Perhaps the first project needs version one of a module, but another project needs version two. This is difficult to manage and frustrating to work with.

Solving this dependency problem is simple enough – in theory. You can install specific versions of Python, and install your modules into a virtual environment, which uses any version of Python you have installed. Virtual environments keep all your modules isolated from other projects. Everything installs into a self-contained place. You can create as many as you like – one for each project is sufficient. This ensures that changes to modules in project A don’t impact project B.
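You’ll set up the project’s real virtual environment with Pipenv later in this chapter, but the underlying idea can be sketched with the standard-library venv module. The paths below are hypothetical, throwaway examples:

```shell
# Each virtual environment gets its own interpreter and its own
# site-packages directory, so modules installed into one project
# cannot conflict with another.
python3 -m venv /tmp/project-a
python3 -m venv /tmp/project-b

# Both environments now exist side by side, fully isolated:
ls /tmp/project-a/bin /tmp/project-b/bin
```

Activating one of these environments (source /tmp/project-a/bin/activate) points python and pip at that project’s isolated copy, leaving the other project untouched.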

This project uses Python 3.7, so get started by installing this version. Open a new Terminal either from your Applications folder or by going to Spotlight Search ➤ Terminal.

You could immediately install Python 3.7, but there’s a better way to manage Python versions. A tool called Pyenv lets you install, uninstall, manage, and switch Python versions on a per-project level. It’s easy to use and takes away a lot of the hard work of managing multiple versions of Python.

Pyenv is a free tool available on GitHub at https://github.com/pyenv/pyenv. A Windows fork is available at https://github.com/pyenv-win/pyenv-win. Pyenv provides detailed installation instructions, but installation through a tool called Homebrew is far simpler and is a common approach for dev tools on macOS.

Homebrew is a package manager for Mac and Linux. Available at https://brew.sh/, Homebrew lets you install packages (such as pyenv) from your command line. It’s free and marketed as “The Missing Package Manager for macOS (or Linux).” Packages are defined as formulae, and you can see the full list at https://formulae.brew.sh/formula/.

Back in your terminal, install Homebrew with the following command:
/usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"

This uses Ruby (included with macOS) to download an install script from GitHub and then run it. Before running, the script lists what it will install and which directories it will create. Press Return to continue the installation. Various words, modules, and formulae will fill up your terminal, but after a few short minutes, you should see “Installation successful!” if everything went to plan. If your installation failed, take a look at the Homebrew troubleshooting tips available at https://docs.brew.sh/Common-Issues.

Go ahead and use Homebrew to install pyenv:
brew install pyenv

Homebrew will update you on the status of this installation. Often it begins by downloading the code for the latest package version and then installing it, which Homebrew calls “pouring.”

Once installed, pyenv needs a configuration. This is done through the pyenv init command. This ensures pyenv works properly when using any other Python commands. Do this by adding the command to your ~/.bash_profile, which this command will do for you:
echo -e 'if command -v pyenv 1>/dev/null 2>&1; then\n  eval "$(pyenv init -)"\nfi' >> ~/.bash_profile
Now restart your shell to pick up this change:
exec "$SHELL"
Note

The .bash_profile is a configuration file for your bash shell (the Mac Terminal). When you start a new bash session, the shell reads this config file. It’s used to configure packages and export variables. Depending on your operating system, this file may be named differently and loaded at different times. macOS Catalina uses zsh (with a .zshrc config file) as the default interactive shell, and Linux systems may use additional files, such as .bashrc. Check the pyenv installation guide at https://github.com/pyenv/pyenv for detailed instructions across many operating systems.
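If your Mac defaults to zsh rather than bash (the case on macOS Catalina and later), the equivalent step is to append the same pyenv hook to ~/.zshrc instead:

```shell
# Same pyenv hook as before, but written to the zsh config file.
echo -e 'if command -v pyenv 1>/dev/null 2>&1; then\n  eval "$(pyenv init -)"\nfi' >> ~/.zshrc
```

As before, restart your shell with exec "$SHELL" to pick up the change.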

The intricacies of how pyenv works are complex and well explained in the pyenv GitHub repo. In short, whenever your Python applications run, they specify the version of Python they need. Before this command reaches your operating system’s Python interpreter, pyenv jumps in and redirects it to the version of Python managed by pyenv.
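The interception works through your PATH: pyenv places a directory of small wrapper scripts (“shims”) ahead of everything else, so the shell finds the shim before the real interpreter. A toy simulation (hypothetical paths, not pyenv itself) shows the mechanism:

```shell
# Create a fake "shim" directory containing our own python command.
mkdir -p /tmp/demo-shims
printf '#!/bin/sh\necho "shim answering as Python 3.7.6"\n' > /tmp/demo-shims/python
chmod +x /tmp/demo-shims/python

# With the shim directory first on PATH, the shim wins the lookup:
PATH="/tmp/demo-shims:$PATH" python   # prints "shim answering as Python 3.7.6"
```

A real pyenv shim does the same trick, except that instead of printing a message, it forwards the command to whichever Python version is configured for the current project.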

The projects contained in this book use Python version 3.7. You can install Python with pyenv. Pyenv has access to many different versions of Python, so run this command to list the available versions of Python 3:
pyenv install --list | grep " 3.[7]"
This is really two commands in one. The first half (pyenv install --list) lists all the versions of Python available to install with pyenv. As there are hundreds of versions, that’s a bit excessive. The pipe symbol (|) instructs the shell to run another command after pyenv has finished, using the output of pyenv as the input to the next command. You can chain together as many commands as you like this way. Finally, grep filters down the list to only the Python 3.7 versions (grep “ 3.[7]”). You can see the output of this command in Figure 4-1.
../images/488914_1_En_4_Chapter/488914_1_En_4_Fig1_HTML.jpg
Figure 4-1

All the Python 3.7 versions listed by Pyenv
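The pipe pattern works with any commands, not just pyenv. As a quick, self-contained illustration (the version strings here are made up for the example):

```shell
# Three version strings flow into grep, which keeps only the 3.7
# releases; wc -l then counts the surviving lines.
printf '3.6.9\n3.7.4\n3.7.6\n' | grep "3.7" | wc -l    # prints 2
```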

Any version of Python 3.7 is suitable for these projects, but as a best security practice, you should generally aim to install the latest stable version unless it contains breaking changes. Python version 3.7.6 is the version used by these projects (but newer versions should work just fine). Install this Python interpreter with pyenv install (sample output shown in Figure 4-2):
pyenv install 3.7.6
../images/488914_1_En_4_Chapter/488914_1_En_4_Fig2_HTML.jpg
Figure 4-2

Console output during the installation of Python version 3.7.6

Just like the previous commands, pyenv lists the various steps in this process as it goes. Once installed, you can list all the available Python versions with pyenv versions (shown in Figure 4-3):
pyenv versions
../images/488914_1_En_4_Chapter/488914_1_En_4_Fig3_HTML.jpg
Figure 4-3

Installed Python versions listed by Pyenv

Here you’ll see any versions of Python you’ve installed with pyenv, alongside your system Python. The asterisk indicates your default Python. Despite installing Python 3.7.6, this is still your system Python. Check the version from your terminal to see (shown in Figure 4-4):
python --version
../images/488914_1_En_4_Chapter/488914_1_En_4_Fig4_HTML.jpg
Figure 4-4

Console output of the current system Python version

To use the newly installed Python 3.7.6, you need to instruct pyenv to configure 3.7.6 as the default Python. You can do this with the global option:
pyenv global 3.7.6

Now when you get the Python version from a Python shell, it should match the version you installed. Python 3.7.6 is now installed, and you can move on to the next step.

Breaking changes are those that are not backward compatible with previous software versions. If you’re upgrading to a breaking version of a package, your code may not work properly, and you may need to make changes to ensure it functions as it did with the previous version.

Developers try to ensure changes don’t cause major issues, and you can often upgrade packages without issue. In the case of breaking changes, sometimes it’s just not possible to rework the code in a way that doesn’t cause an issue for someone. A good example of a breaking change is the migration from Python 2 to Python 3.

A Python package called Pipenv ( https://github.com/pypa/pipenv) is needed to handle your virtual environment. This solves many module import issues and ensures any modules you use in this project won’t interfere with other projects – either current or future.

Install Pipenv with the pip install command:
pip install pipenv
Pip is the default package installer for Python. In most instances, it is installed alongside Python. This command instructs Pip to install Pipenv. If you receive a warning about your version of Pip, you can install the latest version with the upgrade command:
pip install --upgrade pip

Pip will install Pipenv into your global system Python. This seems at odds with my recent advice to always install modules into a virtual environment and never mess with system Python. In this case, Pipenv is the virtual environment manager, which makes it tricky to install in its own virtual environment. As Pipenv is the basis of virtual environments across many Python projects on your computer, it receives a pardon and goes into your global Python. Once installed, all future modules can install into their own, isolated, virtual environments.

You’ll revisit Pipenv later on in this chapter, so for now, all you need to do is install it.

Git and GitHub Repository Configuration

Git is one of, if not the, most popular version control systems in use today, and for good reason. Creating manual revisions of files may work for graphic designers or writers, but the sheer number of changes software developers make on a daily basis demands a robust code storage solution.

Git lets you store, share, collaborate, restore, undo, experiment, and back up your code easily, and it’s free. GitHub exists as a web platform and remote Git host. While you may not need to share code with other people, free, secure, online code hosting is a wonderful tool to have at your disposal. Other tools exist, but GitHub is free to get started with and is very popular with the open source community.

The official Raspberry Pi Git tutorial may assist you here, available at https://projects.raspberrypi.org/en/projects/getting-started-with-git.

Get started by visiting https://github.com and creating a free account if you don’t already have one. Enter a username, an email address, and a password right from the home screen (illustrated in Figure 4-5). Follow the onscreen steps to confirm your email address by clicking a confirmation link sent to you.
../images/488914_1_En_4_Chapter/488914_1_En_4_Fig5_HTML.jpg
Figure 4-5

GitHub welcome screen

Once registered, you’ll see the main GitHub dashboard (Figure 4-6). To start this project, you need a repository, which is a place to store related code. Generally, each individual project starts with a new repository. When developing code based on another’s work, it’s common to “fork” a repository. This copies all the code in a repo and makes a new independent repo. Feel free to clone this project’s repository from https://github.com/CoburnJoe/Pi-Car, but for the purposes of education, you’ll re-create this entire code base from scratch in your own brand-new repo.
../images/488914_1_En_4_Chapter/488914_1_En_4_Fig6_HTML.jpg
Figure 4-6

Main GitHub dashboard

From the left of the GitHub dashboard is a Repositories menu. Choose the green New button from this menu. You need to complete some basic information about your new repository (shown in Figure 4-7). These are as follows:
  • Repository name – This is unique to your account.

  • Optional description – A brief overview of your project.

  • Visibility – Public or private. With public access, anyone can see and potentially contribute to your project. If you’re not ready to share your code with the world just yet, then choose private. Only you and people you invite can see your private project.

  • README initialization – A Git README file is where you can write a quick guide to using this code, and it’s a good practice to have one.

  • .gitignore – .gitignore files tell Git not to include certain files or folders when working on a project. Select the Python option here for a boilerplate file.

../images/488914_1_En_4_Chapter/488914_1_En_4_Fig7_HTML.jpg
Figure 4-7

GitHub new repository screen

When you’re ready, choose the Create repository button at the bottom of the page. You’re almost ready to begin coding. The final step needed is to get your new code and associated Git files onto your computer.

Note

This Git configuration is all performed over the command line. If you’re not comfortable with this, various GUI Git tools exist to aid you. I’d recommend learning the basics of Git first so you understand how these tools work, instead of downloading one and blindly pressing buttons. If you’re still set on a Git GUI, then GitHub Desktop (https://desktop.github.com/) and Sourcetree (www.sourcetreeapp.com/) are both excellent choices, available for both Mac and Windows operating systems.

After creating your repository (or selecting it from the repository menu on the main GitHub dashboard), you’ll see the main repository overview screen, shown in Figure 4-8. Choose Clone or download from the top right and copy the link to the .git file of your repository. This lets Git know where to find your project. For my repository, this looks like this:
../images/488914_1_En_4_Chapter/488914_1_En_4_Fig8_HTML.jpg
Figure 4-8

The GitHub repository landing page

https://github.com/CoburnJoe/Pi-Car.git

Your repository link will contain your username and repository name. Pointing your computer to this repository is simple enough, but you need Git itself installed for this to work. Open a new Terminal and install Git with Homebrew:
brew install git
After installation, verify it is working with the git version command:
git --version
This will spit out your current Git version. Finally, configure your local Git name and email address. Git and GitHub will work without this, but it helps Git associate your commits with your GitHub account:
git config --global user.name "Your Name"
git config --global user.email "The email address used with your GitHub account"

You only need to do this configuration once after installing Git. Git retains this information across all projects on your computer.

Jumping back to the repository link you previously copied, download the code using the git clone command, followed by the repository link:
git clone https://github.com/CoburnJoe/Pi-Car.git
This will copy all the code from your repository onto your computer; the sample output is shown in Figure 4-9. It also links up Git to your repo, so you can push and pull code and perform other Git-associated commands. You can now begin writing code.
../images/488914_1_En_4_Chapter/488914_1_En_4_Fig9_HTML.jpg
Figure 4-9

Sample console output when Git cloning a repository

Integrated Development Environment Setup

Now that you have a way to run Python, a place for your code to live, and a way to track changes, you need a way to write code. You may be familiar with command-line editing tools such as Vi, Vim, or Nano, but an integrated development environment (IDE) provides a wealth of useful tooling and features not possible in (some) other editors.

Subtly different from text editors such as Notepad++ or Sublime Text, IDEs let you
  • Compile code without leaving the program

  • Run unit tests and measure code coverage

  • Debug code and halt execution at arbitrary points in time

In addition to this, IDEs often provide helpful warnings and information about your code as you write it. They point out where you may have made a typo or otherwise named something wrong. They can analyze your code and pull out docstrings, parameter names, and best practices. IDEs shouldn’t be shied away from as they can greatly improve your efficiency as a developer.

My personal preference for a Python IDE is PyCharm, by JetBrains. One reason for this preference is familiarity. I know the tool well – don’t underestimate the performance increase you can achieve by using tools, languages, and frameworks you are familiar with (providing they are suitable for the task at hand). PyCharm has built-in version control and conflict management tools, along with a wide range of features and extensions. JetBrains, the developers of PyCharm, have a wealth of experience behind them and continue to expand their software with useful tools, features, and bug fixes. In addition, there is a large online PyCharm community, where you can find help with any issues you encounter. The PyCharm learning center is a good place to start (www.jetbrains.com/pycharm/learning-center/).

Available for free at www.jetbrains.com/pycharm/, PyCharm has a wealth of tools built-in to make your life easier as a developer and to save you time when writing Python. A professional version is available, but for all the projects in this book, the free community edition is sufficient.

Head over to the JetBrains website and download the latest version of PyCharm community edition. As of writing, this is version 2019.3.3, released in February 2020 (see Figure 4-10). JetBrains will ask for your email address, but your download will start shortly after with no requirement to provide it.
../images/488914_1_En_4_Chapter/488914_1_En_4_Fig10_HTML.jpg
Figure 4-10

PyCharm download screen

If you have a preferred Python IDE, there’s no reason you can’t use it to follow along with this book. Most of the examples given are executed through macOS Terminal commands, with only a small section geared toward PyCharm-specific steps.

Once downloaded, click the Mac disk image (ending in .dmg). This will mount the image onto your desktop and bring up a Finder window (Figure 4-11). Use this window to drag PyCharm into your Applications folder, thereby installing it. If you like to keep a clean computer, unmount the volume afterward by dragging it to the trash.
../images/488914_1_En_4_Chapter/488914_1_En_4_Fig11_HTML.jpg
Figure 4-11

PyCharm Mac installation

Open PyCharm, and click through their first installation steps, as shown in Figure 4-12. Accept the license agreement and use the default themes and shortcut keys (or customize how you like to work). When you get to the Welcome to PyCharm menu, choose Open and navigate to the local copy of your code which you cloned from your GitHub repository. This tells PyCharm that everything in this folder is your project. When opening a project for the first time, PyCharm needs some time to update its indexes, which you can see it doing in Figure 4-13.
../images/488914_1_En_4_Chapter/488914_1_En_4_Fig12_HTML.jpg
Figure 4-12

PyCharm first time landing page

../images/488914_1_En_4_Chapter/488914_1_En_4_Fig13_HTML.jpg
Figure 4-13

PyCharm updating its indexes

PyCharm consists of four main areas. You’ll learn more about specific parts as you progress through the projects, but to start with, they are as follows:
  1. The Project pane is on the left.

  2. The Navigation Bar is at the top, with buttons on the right.

  3. The Tool Window Bar is at the bottom.

  4. The main code editor is on the right.

These four key areas are shown along with the main PyCharm screen in Figure 4-14.
../images/488914_1_En_4_Chapter/488914_1_En_4_Fig14_HTML.jpg
Figure 4-14

The main PyCharm project view

The project area lists details about your project. It lists all of your files (including hidden files used by Git). The navigation bar shows you where the current file lives in relation to the whole project. It also provides options to run your unit tests and modify your Python configuration. The tool window bar at the bottom provides lots of useful tools. A terminal is here, along with a Python console, version control options, and many more features. Finally, the main code editor is where you can read and write your code. Double-click a file to see it here. You can work with multiple files, split this into two or more views, and more.

You’re now ready to begin writing code. By default, PyCharm will automatically save any changes for you in the background. This only happens when PyCharm loses focus – often when you click another application such as your web browser or another PyCharm tab.

Python Virtual Environment Configuration

If you look at the bottom right of PyCharm, it says Python 2.7. This isn’t correct, as you’re using Python 3.7 for this project. To fix this, you need to use Pipenv to create a new virtual environment, and then point PyCharm at that. Start with a new terminal session – either through the Mac Terminal application or by pressing Terminal in the Tool Window Bar at the bottom of PyCharm. Create a virtual environment with the pipenv install command:
pipenv install

This will create a new virtual environment for you using your global Python version, which is configured as Python 3.7 as per the previous instructions on Pyenv. Pipenv provides a status bar on this progress, but as you have no other modules to install yet, it shouldn’t take very long.

Back in your Project pane, you’ll notice two new files. The Pipfile.lock lists all the packages your project needs, and the Pipfile does the same but in a more human-readable way. The Pipfile looks something like this:
[[source]]
name = "pypi"
url = "https://pypi.org/simple"
verify_ssl = true
[dev-packages]
[packages]
[requires]
python_version = "3.7"

Let’s break down these four sections. The source specifies where to look for any Python packages you need. For the vast majority of projects, this is PyPI, the de facto Python module repository. If you need to install modules not found on PyPI, you’ll need to specify a different source here. The dev-packages area lists any packages and versions you need to write your code but don’t need in the final build – often linting tools or a unit testing module.

The packages section lists any modules and versions your code needs to work. Finally, the requires section lists any other dependencies. Your Python version 3.7 is listed here, which outlines what version of Python is needed to run this project.

This file will grow as you build the projects and import more modules (other than those modules found in the Python core code base). This file lists the modules you need, but not necessarily specific versions. Installing modules listed as “*” will install the latest build and associated dependencies.

The Pipfile.lock contains mostly the same information, but it serves a different purpose. This is locked or pinned to specific versions. It’s used for systems, pipelines, and automated tooling to read and install modules from. This won’t ever install a newer version; it’s always locked to a specific version listed when the file was last generated. It also contains hashes of modules, which allows pipenv to verify the contents are exactly the same as those it expects. This again prevents accidental upgrade of packages and protects against a polluted remote source, which may serve up poisoned or otherwise “tampered with” modules.
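The hash check is the same idea as verifying any download against a checksum: identical bytes always produce an identical digest, so a tampered module stands out immediately. A minimal sketch using Python’s standard library (the “module contents” here are a stand-in, not a real package):

```shell
# Compute a SHA-256 digest of some stand-in "module" bytes, in the same
# spirit as the sha256 entries inside Pipfile.lock.
printf 'pretend module contents' | python3 -c \
  "import sys, hashlib; print('sha256:' + hashlib.sha256(sys.stdin.buffer.read()).hexdigest())"
```

If even one byte of the input changes, the digest changes completely, which is why a lock file pinned to hashes can detect a poisoned remote source.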

Let’s install some basic modules and see how this file changes. Back in the terminal, install Flask, Pytest, and Black:
pipenv install flask
pipenv install pytest --dev
pipenv install black --pre --dev

Flask is the Python microframework that forms the basis of this whole project. Pytest is a very popular unit testing framework. Notice how the --dev flag is used: this tells Pipenv to install the package under the dev-packages section of the Pipfile. Finally, Black is an “uncompromising code formatter.” It reformats your code to meet the style recommendations of the Python community, handling formatting for you so you can think about the logic and the stuff that really matters. Black is still in prerelease, so the --pre flag ensures Pipenv knows you really do want to install it. Using a tool such as Black is a great habit to get into, especially if you come to work with multiple developers on the same code base. You’ll no longer argue over trivial issues such as the placement of commas, which quotes you used, or how long lines are. It sounds trivial, but everyone has a preferred code style; Black makes all the code consistent. Don’t overlook how important this is.

Back in your Pipfile, you’ll see it has updated to reflect your new modules:
[[source]]
name = "pypi"
url = "https://pypi.org/simple"
verify_ssl = true
[dev-packages]
pytest = "*"
black = "*"
[packages]
flask = "*"
[requires]
python_version = "3.7"
[pipenv]
allow_prereleases = true
One final step needed in the terminal is to regenerate the Pipfile.lock. Pipenv won’t always do this for you; it doesn’t want to potentially break anything upstream of your local code by inadvertently locking a newer package – maybe one that contains breaking changes. Go ahead and generate this yourself with the pipenv lock command:
pipenv lock
Finally, let’s run some of these newly installed packages. In the terminal, you can run Black by telling it which file or folder to work on:
black $PWD

The black keyword runs Black itself. $PWD is a shell variable (short for “present working directory”) holding the path of the folder your terminal is currently in, so this command means “run Black on all the code in the current folder.”
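You can see what $PWD holds at any time; this quick demonstration uses /tmp purely as an example directory:

```shell
# $PWD always tracks the directory your shell is currently in;
# the pwd command prints the same value.
cd /tmp
echo "$PWD"   # prints the current directory path
pwd           # same value
```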

Notice how this command failed? Something along the lines of “bash: black: command not found”? This is because Black is installed in your virtual environment, and you need to enter that environment to use the packages in it. Do so with pipenv shell:
pipenv shell
Notice how your terminal now says the name of your virtual environment in brackets at the start of each line. Yours may vary slightly, but for me, my terminal now looks like this:
(Pi-Car) bash-3.2$

This is a helpful reminder that you are in a virtual environment. Go ahead and run that Black command again. If everything is working correctly, Black will say that it has nothing to do. This is perfectly fine – you have no Python files to format yet (but if you did, Black is ready to format them).

Now run some unit tests with Pytest:
pytest

Once again, Pytest will say there are no tests to run. This is not a problem – you simply haven’t written them yet. You didn’t think unit testing would be that easy, did you?
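To see Pytest do some real work, you could create a first test file. The file and function names here are hypothetical; Pytest collects any file matching test_*.py and any function whose name starts with test_:

```shell
# Write a tiny module containing one function and one test for it.
cat > test_sample.py <<'EOF'
def add(a, b):
    """Return the sum of two numbers."""
    return a + b


def test_add():
    assert add(2, 2) == 4
EOF
```

Running pytest again inside your virtual environment should now report one passing test. Delete test_sample.py afterward if you don’t want it in your repository.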

You’re now all set with your virtual environment. It works. You can install packages and update your pipfile and pipfile.lock. One final step is to point PyCharm to your fresh environment. From the bottom right of PyCharm, click Python 2.7 and choose Add Interpreter. Choose Virtualenv Environment from the left-hand menu and then choose Existing Environment. Find your version of Python (3.7) listed in the Interpreter drop-down menu and then click OK. This menu is shown in Figure 4-15.
../images/488914_1_En_4_Chapter/488914_1_En_4_Fig15_HTML.jpg
Figure 4-15

PyCharm’s “Add Python Interpreter” menu

PyCharm will do some figuring out and perhaps recalculate its indexes. When it’s ready, the bottom right of PyCharm will now say “Python 3.7,” followed by the name of your virtual environment. Now you can use a virtual environment in the terminal, or with any buttons and tools within PyCharm itself.

Note

You may have noticed a Pipenv option in the PyCharm interpreter menu, so why select Virtualenv instead? Pipenv is really a wrapper around the Python package Virtualenv. Pipenv has been creating your virtual environments with Virtualenv in the background all along (along with a few other helpful things). Feel free to select Pipenv and create a new virtual environment this way instead, but it’s not possible to point PyCharm to an existing Pipenv virtual environment without selecting Virtualenv. In either case, knowing how Pipenv works in the background is a much better place to be in than blindly relying on PyCharm to operate it for you.

Git Workflow

Now that you have two new files, let’s get these into your Git repository. You could just crudely shove them into the master branch, but that’s not very refined. Many businesses are not willing to tolerate code going to a customer without a code review – other developers looking at and approving the changes.

You could argue that this code is just for you, and there are no other developers working on it, so you’ll shove it wherever you please. This is a valid approach, yet as you’ll see shortly, your Raspberry Pi will look at the master branch. If you blindly cram unstable or even unfinished code into it, your Pi may not work until you finish the code. This may be acceptable just for you, but again, if this were a real product, website, or business, things randomly breaking is not an acceptable option.

By creating a branch, you can safely work and experiment on code away from the main branches, and all the other developers and systems which rely on them. A common system is a master branch, which holds your production code. A develop branch is used for nearly ready or possibly unstable changes. Code goes into a feature branch (often stemming from develop) while it is developed. When ready to share, it gets pull requested (PR’d) to develop. You won’t need other developers’ approvals if it’s just you on this project, but PRs provide an easy way to undo a whole feature at a time.

Once merged into develop , you can pull request (PR) to master if you’re ready to go “live.” A final option is that of urgent bug fixes. If there’s a problem on the master branch (and therefore your production code), this can be fixed with a branch taken off master. This is known as a hotfix, and it goes back to master, bypassing develop. Once the crisis is over, you can merge it from master down to develop, to keep everything in sync.

With all that said, let’s go ahead and create a develop branch. These projects won’t use feature branches, but feel free to if you’d like the practice (the Git process remains unchanged). From the PyCharm terminal, create a new local branch with the git checkout command:
git checkout -b develop

This creates a new branch called develop. The checkout command lets you switch between local branches, and by using the -b flag, you can create a new branch with a name – in this case, “develop.” Notice how Git confirms the command it just executed, and the bottom right of your PyCharm window shows the current branch.
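Incidentally, the -b flag combines two separate operations. This sketch demonstrates the equivalence in a throwaway repository (the temporary directory and identity values are placeholders), so nothing in your real project is touched:

```shell
# Practice in a temporary directory, away from any real repository
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git -c user.name=You -c user.email=you@example.com \
    commit -q --allow-empty -m "initial commit"

git branch develop         # 1) create the branch...
git checkout -q develop    # 2) ...and switch to it – same as: git checkout -b develop
git rev-parse --abbrev-ref HEAD   # prints: develop
```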

Use git status to see a list of files that have changed and are either staged already or untracked and uncommitted, shown in Figure 4-16:
git status
../images/488914_1_En_4_Chapter/488914_1_En_4_Fig16_HTML.jpg
Figure 4-16

Untracked files highlighted by Git

This returns three files so far (or more if you’ve made other changes):
.idea/
Pipfile
Pipfile.lock
The Pipfile and Pipfile.lock are expected changes, but what is the .idea folder, and where did it come from? This is a PyCharm folder. You don’t need it in your repository, and you can safely ignore it. To ignore this folder from all your Git commands, you need to add it to your .gitignore. Remember, GitHub automatically generated a Python-specific .gitignore file for you. Add this folder to your .gitignore file with this command:
echo .idea/  >> .gitignore

This is a bash command to add “.idea/” to your .gitignore file. You don’t need to open the file; this will append to the end of it. Now when you check the git status, you’ll see three files, but with a difference. The .idea folder has gone, but it’s now been replaced by your .gitignore file. You’ve modified this file, so you now need to commit it.
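The append behavior is the important part – a single > would overwrite the file instead. This Python sketch mirrors the difference on a temporary stand-in file, so your real .gitignore is untouched:

```python
import tempfile
from pathlib import Path

# A temporary stand-in for .gitignore
path = Path(tempfile.mkdtemp()) / ".gitignore"
path.write_text("*.pyc\n")          # existing contents (like GitHub's generated file)

with path.open("a") as gitignore:   # "a" mode appends, like bash's >>
    gitignore.write(".idea/\n")

print(path.read_text())             # both lines survive: *.pyc, then .idea/
```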

Committing in Git saves your changes at a point in time. This only happens to your local Git repository, so it’s not the final say, but it’s an important command to understand. Before you can commit changes, however, you need to tell Git which files to commit. You could use something like this:
git add .
But this is a bad practice. This command adds ALL changed files to your local Git. You may accidentally commit testing files, passwords, repository or access keys, or any other number of files you don’t want to share. By explicitly adding the files you want to commit, you can avoid all of these issues. Use git add to let Git know which files you want to commit:
git add .gitignore
git add Pipfile
git add Pipfile.lock
If you run git status again, you’ll see all three of these files have turned green, indicating that they have been staged ready to commit (shown in Figure 4-17). Commit them with git commit:
git commit -m "First pipfile"
../images/488914_1_En_4_Chapter/488914_1_En_4_Fig17_HTML.jpg
Figure 4-17

Tracked files highlighted by Git

The -m option lets you type a message to accompany your commit. Make this descriptive enough that you – and any other developers who may look at this Git history in the future – can understand what the change is. It sounds simple, but lacking the discipline to write good messages here will cause you trouble later on – either when you’re trying to track down what changes occurred in a repo, or when you start working with other people, who will look at you quizzically as to why you let your cat walk all over your keyboard when writing your commit messages.

The final step needed is to push these changes. This tells Git to take all your local commits and send them to the remote repository – GitHub in this case. This gets the code off your machine. It backs it up remotely and ensures other developers can see and work with it. Without pushing to a remote repository, your code exists only on your computer.

As this branch does not exist in GitHub just yet, you need to tell Git to make the branch on the remote repository first with the --set-upstream option:
git push --set-upstream origin develop
Providing you’re not violating any repository rules configured on GitHub, and there are no surprises (such as a bad Internet connection, or an outage at GitHub), then you’ll see a push confirmation message, highlighted in Figure 4-18. These may vary in time depending on the number and size of the changes you are pushing.
../images/488914_1_En_4_Chapter/488914_1_En_4_Fig18_HTML.jpg
Figure 4-18

Console output during the first push of code to a remote repository

When pushing to a repository that already exists on the remote, you don’t need to be so specific:
git push origin develop

As a good rule of thumb, you should avoid pushing without specifying where to push to. While a plain “git push” often works perfectly fine, it pushes to whatever remote branch your local branch is tracking – which may not be what you expect. By explicitly stating the branch to use, you avoid any risk of shoving your changes into the wrong place.

Pull Requests

As discussed previously, pull requests are the best way to get your code into another branch. They allow other developers to see and approve of your changes, and they provide an excellent way to discuss the merits of said code. Developers can comment on specific lines or changes, and everyone gets to see your changes with a nicely formatted output. Even if you’re not working with other developers, pull requests are a great way to move code from one branch to another.

Before starting a pull request, you need to get all the changes from the upstream branch into your branch. If you’re going from develop into master, then develop needs all the changes in master, plus your new additions. This places the onus of resolving conflicts and integrating all changes on the pull request author. Solo developers will rarely encounter any issues, but once you join a team, it’s common for a branch to fall behind the main branch fairly quickly. Keeping your branch updated with upstream changes is a good practice to employ.

Let’s merge a change from master to develop locally, and then pull request results. Begin by switching to the master branch (assuming you have no uncommitted work on the current branch):
git checkout master
Now get the latest changes from the remote master branch to your local master branch:
git pull
Switch back to your local develop:
git checkout develop
Now merge your (local) master into your (local) develop:
git merge master
If your changes are small, this merge will continue without issue. Should there be significant changes in either branch, then you may end up with a merge conflict. Merge conflicts happen when two branches contain changes to the same file within the same lines. In some cases Git cannot automatically merge the files for you, so you need to do so manually. This is easy in PyCharm. Head over to VCS ➤ Git ➤ Resolve Conflicts. This brings up the PyCharm Conflicts menu, shown in Figure 4-19. It lists all the files that have conflicts. You can choose to accept your changes or the changes from the remote. If you need to pick and choose changes from both branches, then double-click each file and work through each change.
../images/488914_1_En_4_Chapter/488914_1_En_4_Fig19_HTML.jpg
Figure 4-19

Git conflicts highlighted by PyCharm

Once the conflicts are resolved, commit the changes:
git commit -m "Merged master"
and then push your branch:
git push origin develop

At this point in time (and providing nobody sneaks new changes into master), your develop branch contains your changes plus all the changes in master. Develop is now ready to pull request into master. This is the stage where I’ll deviate from a command-line only Git tutorial. Git purists may start hating now, but creating PRs from the command line requires even more Git knowledge than is needed to start using Git right now. GitHub shows the changes between branches in a much clearer way than the stock Bash command line does, so now feels like an appropriate time to use a GUI.

Head over to http://GitHub.com and select your project. From the Project Navigation bar (Figure 4-20), choose the branches button. This is preceded by a number, matching the number of open branches your remote repository has. Local branches don’t affect this count unless they are pushed to the remote.
../images/488914_1_En_4_Chapter/488914_1_En_4_Fig20_HTML.jpg
Figure 4-20

GitHub Project Navigation bar

This overview lists all your branches, and is shown in Figure 4-21. Find the develop branch (or any branch you wish to pull request from), and click New pull request on the right-hand side. This loads the Create pull request page, shown in Figure 4-22.
../images/488914_1_En_4_Chapter/488914_1_En_4_Fig21_HTML.jpg
Figure 4-21

The GitHub branch overview page

../images/488914_1_En_4_Chapter/488914_1_En_4_Fig22_HTML.jpg
Figure 4-22

GitHub’s new pull request page

This page is split into two sections. The top half contains the details of your pull request – reviewers, pull request messages, labels, assignees, and more. The bottom half outlines the commits made and the changes they contain. It highlights any additions or subtractions, along with any commit messages made on this branch.

In the top half, enter an appropriate title and description. What’s an appropriate description? One that lets other developers know the purpose of this pull request. There are no hard and fast rules here, but these prompts are a good place to start:
  1. Reason – Why are you making this PR? What problem does it solve?

  2. Rationale – Why did you code it this way? Was there another option that wouldn’t work in this scenario?

  3. Background – Other developers may not have the context of the fix that you do.

  4. High-level list of changes (if significant).

  5. Any future changes to follow, or pull requests in other repositories to accompany this change?

As a final sanity check, take a look at your commits and the diff. If this was someone else’s code, what would you say? Have you left any debug or test logic in the code? Is it ready to run in the destination branch, or production? It’s OK to go back and make changes before finishing this PR. Once ready, choose the big green Create pull request button to share your changes with the world.

Once opened, it’s time to sit back and bask in that sweet sweet karma that comes from being an awesome developer – go you! On a serious note, it’s a bad practice to merge PRs without sign-off from other developers in your team. Rules and regulations vary between businesses and teams, but it’s not uncommon to face disciplinary action for failing to get enough approvals on a pull request (accidents excluded). If you’re working on a personal project without any other developers, go ahead and merge this PR with the big green Merge pull request button. Providing there are no conflicts (there shouldn’t be at this stage), your PR will merge without issue. If you do encounter any conflicts, then revisit some of the previous steps to merge and resolve them.

To keep your repository tidy, it’s a good idea to close feature branches when merging PRs. Each branch then exists only for a single purpose or feature. Branches cost nothing, but having lots of old ones around can clutter up your repository. Create new branches as and when you need them. Common branches such as master and develop (and sometimes others) rarely get closed.

Repository Rules

Any good repository needs rules. Rules stop you committing directly to the master branch, or merging a pull request without enough approvals. Once again, when working solo you may not need as many rules, but they can still keep you safe. This project uses CI/CD to automatically deploy code from the master branch to the Pi (as you’ll see in the next chapter). Any changes to the master branch could result in the Pi not working properly. Sure, a home project on the Pi failing isn’t a big deal, but what about a system customers are paying for, a military missile, or a medical system? Small mistakes here could cost lives (as illustrated in some of the software development case studies).

It’s hard enough to write bug-free software, so save yourself the trouble and protect your branches! At the very least, branch permissions prevent you from accidentally committing work-in-progress code to the wrong branch.

To add branch permissions, head over to http://GitHub.com and load up your repository. Choose Settings from the top navigation bar (as shown in Figure 4-23). From the left navigation bar, select Branches and you’ll see the Branch protection rules section. Any existing branch rules will appear here. Let’s create a new branch rule by choosing Add rule.
../images/488914_1_En_4_Chapter/488914_1_En_4_Fig23_HTML.jpg
Figure 4-23

GitHub’s top navigation bar

This add branch rule screen lets you configure in-depth rules for a branch. It’s possible to have a whole host of rules for your master branch, and no rules at all for develop. It doesn’t make sense to have a blanket rule policy for the whole repository. Let’s add rules to the master branch and learn about the possible rule options in the process. Figure 4-24 shows the branch protection rules page.
../images/488914_1_En_4_Chapter/488914_1_En_4_Fig24_HTML.jpg
Figure 4-24

Adding a branch protection rule in GitHub

The Branch name pattern defines what branch to apply this new rule to. You can use the wildcard * operator here to match patterns, but for now, master is sufficient.

Require pull request reviews before merging means that any pull request must have a certain number of approvals before it can merge. Not very helpful for solo developers, but incredibly useful for a team. The Required approval reviews drop-down lets you specify how many reviews are needed to merge. By enabling Dismiss stale pull request approvals when new commits are pushed, GitHub will reset the approvals if the code changes while a PR is open. Finally, Require review from code owners means a designated code owner has to approve any PRs in addition to the other rules. This is useful for any benevolent dictators to have the final say over a code change.

Next, Require status checks to pass before merging means your branch must meet any status checks before it can merge. This could be a limit on the number of conflicts present, or that all the code from the destination branch has already merged into the source branch. This is easy to meet if you follow the Git steps on previous pages.

Require signed commits enforces a cryptographically verifiable signature across all commits. This ensures users can trust project authors – not something typically enforced on private projects.

The Require linear history option means all commits merging in a PR go into the destination branch as one commit, instead of several. This keeps the main branch tidy if there are lots of changes going in.

The Include administrators option determines whether these rules apply to every user, including project administrators, or only to users who are not project administrators.

Finally, at the bottom is the Rules applied to everyone including administrators section. Confusingly this doesn’t include the Include administrators option from the previous section. Inside here, Allow force pushes lets you allow or disable force pushing. Force pushes let you shove code into a branch that may not otherwise go. This could be commits which don’t have the matching commit history or don’t meet a particular rule. It’s not advisable to enable this.

The Allow deletions option lets users delete this branch. You probably don’t want this to happen on develop or master branches when using CI/CD.

As a good starting point, you should require pull request reviews before merging, with a suitable number of approvals – even one at this moment is sufficient. Enable Dismiss stale pull request approvals when new commits are pushed, and require status checks to pass. If these rules are too prescriptive for you, or not strict enough to prevent problems, then loosen or tighten them as you see fit once you get into the flow of Git and your project.

GitHub’s Project Administration guide available at https://help.github.com/en/github/administering-a-repository goes into extensive detail with all of these options are more.

Pipeline Configuration

A pipeline or build step is an essential part of a CI/CD process. Tools such as Jenkins (https://jenkins.io/) or Travis CI (https://travis-ci.com/) exist to let you completely automate the deployment process. From running various tests and building other services, to building and deploying images and rolling back changes, pipeline automation tools can save you a serious amount of time and effort. While full-scale automation is a little outside the scope of this book, let’s discover how to run your unit tests automatically on any branch.

Before configuring a pipeline process, let’s write a basic unit test first, so your pipeline has something to run. Open up PyCharm and create a new branch off develop with a suitable name:
git checkout develop
git pull
git checkout -b feat/tests
Create a new test folder at the top level inside your Pi-Car folder:
mkdir tests
Create two new files inside this folder. Pytest requires all test files to begin with the word “test”, so create a test_practice.py file. The second file is a blank file called __init__.py:
touch tests/test_practice.py
touch tests/__init__.py
The touch command creates blank files. Why __init__.py? This special file tells your Python interpreter that this folder contains Python modules. You only need one per directory. They are often blank, and are an essential requirement for working with Python. Go ahead and create another __init__.py file at the top level:
touch __init__.py
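To see what __init__.py enables, this sketch builds a throwaway package in a temporary directory and imports a module from it – the package and module names here are invented for illustration:

```python
import importlib
import sys
import tempfile
from pathlib import Path

# Build <tmp>/demo_pkg/__init__.py and <tmp>/demo_pkg/practice.py
root = Path(tempfile.mkdtemp())
package = root / "demo_pkg"
package.mkdir()
(package / "__init__.py").touch()                # marks the folder as a package
(package / "practice.py").write_text("ANSWER = 42\n")

sys.path.insert(0, str(root))                    # make the package importable
practice = importlib.import_module("demo_pkg.practice")
print(practice.ANSWER)  # prints: 42
```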
Inside test_practice.py, create a very basic class and test:
class TestPractice:
    def test_one(self):
        assert 1 == 1
        assert "banana" == "banana"
Black your code and then run this first test with Pytest:
black $PWD
pytest tests
This command tells Pytest to run all the tests it finds in the tests folder, and its output is shown in Figure 4-25. You could replace this with any other folder and it will run the tests there – providing the test names meet Pytest’s standards. Notice how Pytest gives you a nice little test status? All your tests should pass, and Pytest should indicate so with a green bar and a 100% sign. Let’s make these tests fail. The assert statement is used to test that a certain condition is met. Change one of the asserts to something that is not a fact:
assert "apple" == "banana"
../images/488914_1_En_4_Chapter/488914_1_En_4_Fig25_HTML.jpg
Figure 4-25

Passing unit tests

Now run your tests again – notice everything goes red (Figure 4-26)? Pytest highlights exactly which test failed, and why. This information is extremely useful when writing tests, or troubleshooting a failing test. Go ahead and undo this failing change. Commit your change and push your new branch:
git add __init__.py && git add tests/
git commit -m "Wrote a basic unit test"
git push --set-upstream origin feat/tests
../images/488914_1_En_4_Chapter/488914_1_En_4_Fig26_HTML.jpg
Figure 4-26

Failing unit tests

Now you have a unit test. You have a local and remote branch called feat/tests, and you know how to write assertions. Pull request and merge your branch into master – you’ll need it there for this next step. Let’s get GitHub to run these tests in your pipeline.

GitHub’s built-in pipeline process provides basic CI/CD functionality. The GitHub Marketplace (https://github.com/marketplace) is full of premium and free tools to provide extensive control over deployments, but GitHub Actions is the quickest way to get started with (simple) CI/CD in GitHub, and it’s built in to the core offering. You can see the introductory page in Figure 4-27.
../images/488914_1_En_4_Chapter/488914_1_En_4_Fig27_HTML.jpg
Figure 4-27

GitHub’s initial Actions page

Head over to your repository and choose Actions from the top navigation bar. Here you’ll find several boilerplate actions to get you started. Skip all of these, and choose the Set up a workflow yourself button from the top right of this page. This next interface can look daunting (Figure 4-28), but it’s not as complex as it looks. This lets you configure exact actions to happen when a branch merges. This could be deploying to Amazon Web Services (AWS), sending an email, or in this case, running tests.
../images/488914_1_En_4_Chapter/488914_1_En_4_Fig28_HTML.jpg
Figure 4-28

Configuring a build pipeline in GitHub

GitHub Actions are configured using a YAML file. By default, GitHub proposes you store this nested under .github/workflows and call it main.yml. You can change this name and location from the top left of this action configuration page. Alternatively, you can commit this file as is and then pull and work on it locally. For now, let’s edit it online and leave its default file name and location. The benefit of editing this file in the GitHub Actions file editor is automatic syntax checking. If you make a mistake, or enter a command that looks plausible but isn’t valid, GitHub will inform you of the problem.

YAML is a recursive acronym for “YAML Ain’t Markup Language.” It’s a human-readable configuration language, used for config files and build pipelines. It’s simple to use, and it nests commands through indentation, colons, and hyphens.
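As a small illustrative sketch – the keys and values below are invented, not part of the project’s pipeline – this is what those three building blocks look like together:

```yaml
name: Example pipeline      # a key/value pair, separated by a colon
options:
  retries: 3                # nesting is expressed by indentation (spaces only)
  environments:
    - staging               # list items are expressed with hyphens
    - production
```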

GitHub’s actions are free (with some usage limits). You’re unlikely to reach these limits for small projects or accounts where you rarely need to build multiple branches concurrently. Actions support a huge amount of customization. You can trigger builds when any code changes, or just when certain branches run. You can name your tasks, limit the execution time, run more than one in parallel, and lots more. Here’s the starter code you need to run your unit tests:
name: Validate Build
on: [push, pull_request]
jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: [3.7]
    steps:
    - uses: actions/checkout@v1
    - name: Set up Python ${{ matrix.python-version }}
      uses: actions/setup-python@v1
      with:
        python-version: ${{ matrix.python-version }}
    - name: Install Dependencies
      run: |
        python -m pip install --upgrade pip
        pip install pipenv
        python -m pipenv install --dev --pre --system
        export PYTHONPATH="$PWD"
    - name: Black Check
      run: |
        black --check $PWD
    - name: Unit Tests
      run:
        pytest tests

Let’s break this down. This .yml file configures your build pipeline to run in GitHub. Each individual line represents a configuration parameter. Each line starts with its name, followed by a colon, and then the value. Line breaks only exist to make it easier to read, and commands are further subdivided and nested with tabs and hyphens – a bit like Python.

It’s possible to run multiple pipelines, each one performing a different task. You can name a pipeline like this:
name: Validate Build
The on command specifies when to automatically run this pipeline. You can put various options here. The push command will run the pipeline whenever code is pushed to the repo. Pull_request runs this pipeline whenever a pull request is made. This is a good starting point in terms of useful places and times to run this pipeline:
on: [push, pull_request]
All the following commands are nested underneath the jobs section. This is simply a grouping, and a way to separate the pipeline actions from the configuration and metadata:
jobs:
Everything underneath the build section is used to configure the prerequisites needed to run this pipeline. This is a good place to specify the operating system and Python versions you want to test with. The runs-on option lets you choose the operating system to test with – in this case, the latest version of Ubuntu Linux. The matrix strategy offers you the ability to run tests against several different versions of Python, specified by the version numbers in square brackets ([3.7]). This build only tests against Python version 3.7, but you can enter many different Python versions here (separated by commas):
runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: [3.7]
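For example, to test against several Python versions in one build, the matrix could read as follows (the extra version numbers are illustrative) – the steps then run once per version:

```yaml
    strategy:
      matrix:
        python-version: [3.6, 3.7, 3.8]
```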
The steps section configures the build steps. This installs specific versions of Python, along with your project dependencies. It runs your unit tests and checks that the code meets Black’s stringent requirements. If any of these steps fail, the build fails. The uses name configures the pipeline to require a dependency. Here’s how the pipeline gets the latest version of your code:
- uses: actions/checkout@v1
The Install Dependencies step configures pip to install your project dependencies. It adds several options to Pipenv and then exports your Python path – just like you did on your computer:
    - name: Install Dependencies
      run: |
        python -m pip install --upgrade pip
        pip install pipenv
        python -m pipenv install --dev --pre --system
        export PYTHONPATH="$PWD"
The Black Check step checks your code complies with the Black code formatter. By using the --check option, Black won’t actually change any of your code – it just checks there is nothing to change:
    - name: Black Check
      run: |
        black --check $PWD
Finally, run the unit tests with Pytest:
    - name: Unit Tests
      run:
        pytest tests
When you’re ready, choose Start commit from the right-hand side of the actions screen, and fill in the commit details, as shown in Figure 4-29. Make sure you create a new branch, instead of committing to master.
../images/488914_1_En_4_Chapter/488914_1_En_4_Fig29_HTML.jpg
Figure 4-29

Committing a change though GitHub’s web-based interface

To see this pipeline in action, head over to the Actions tab from the main repository view (Figure 4-30). Here you can see all the historical builds, their status, and the pipeline names. On the left you can filter by pipeline name – as defined at the top of your .yml file. On the right are all the builds that have run in the past, with the most recent first. You can filter these builds by name or other search criteria, or by using the predefined filters for event, status, branch, and actor (the person whose change kicked off the build).
../images/488914_1_En_4_Chapter/488914_1_En_4_Fig30_HTML.jpg
Figure 4-30

The Actions screen on GitHub

The list of pipeline runs shows supplementary information about each build. The green tick or red cross indicates whether the build was successful. Next is the name, followed by the most recent commit in that build. The event that triggered the build (such as a pull request or code push) is listed after the word “on”. The middle lists the branch this pipeline ran against, and the user who made the triggering change. Finally, the right-hand side lists the duration of the build, along with the time and date it last ran. The ellipsis (three dots) on the far right takes you to the workflow configuration file once expanded.

By clicking the build name, you can see detailed pipeline information – shown in Figure 4-31. By clicking the Python version number on the left-hand side, you can see detailed build information. Here’s where you’ll find the output of Black or Pytest, alongside any errors. This is a valuable troubleshooting tool when configuring a new pipeline.
../images/488914_1_En_4_Chapter/488914_1_En_4_Fig31_HTML.jpg
Figure 4-31

Sample output of a GitHub pipeline

When you’re ready, pull request and merge this pipeline branch to the master branch. Congratulations! You are now well on your way to becoming an expert software developer. You’ll use this pipeline regularly to assess the quality of your code. Whenever you push new code or create a pull request, this pipeline will run. It will run your latest code and tests, so you don’t need to worry about updating it. You may want to regularly maintain it – adding new versions of Python as they become available, or fine-tuning it to your preferences.

Git Cheat Sheet

Now that you know enough Git to be dangerous, refer back to this cheat sheet (Table 4-1) for a helpful reminder of the basics.
Table 4-1

A list of common Git commands

Command

Description

git init

Initialize a new git repository in the current folder.

git add <file>

Stage a single file into your local branch.

git add .

Add ALL files recognized by Git. Localized to your current directory. Use wisely!

git commit -m "commit_message"

Commit your staged changes with a message.

git commit

Commit your staged changes without a message (opens up your default console text editor with a sample message).

git push origin develop

Push your staged changes and commits to a specific remote branch.

git push

Push your staged changes and commits to whatever your remote branch is. Use wisely!

git push --set-upstream origin <branch_name>

Push your staged changes and commits to a new remote branch.

git checkout <branch_name>

Switch to a preexisting local branch.

git checkout -b <branch_name>

Create a new local branch and switch to it.

git remote -v

List the remote branches and repositories your local branch is linked to.

Chapter Summary

This chapter has equipped you with everything necessary to write the code for this project. You used Pyenv and Pipenv to manage different versions of Python and to configure virtual environments for per-project dependency management. You created your own Git repository on GitHub, learned how to create pull requests, and configured a build pipeline to run your unit tests automatically.

You and your computer are now fully equipped to develop any Python application you like, using industry-standard tools and best practices. In the next chapter, you’ll continue this configuration by preparing the Pi itself. You’ll learn how to install an operating system, install the latest version of Python, and pull your application code and run it on the Pi itself.
