Trying out CRC (CodeReady Containers) to run OpenShift 4.x locally

I've been working a bit with Red Hat lately, and one of the products that has intrigued me is their OpenShift Kubernetes platform; it's kind of like Kubernetes, but made to be more palatable and UI-driven... at least that's my initial take after taking it for a spin both using Minishift (which works with OpenShift 3.x), and CRC (which works with OpenShift 4.x).

Because it took me a bit of time to figure out a few details in testing things with OpenShift 4.1 and CRC, I thought I'd write up a blog post detailing my learning process. It might help someone else who wants to get things going locally!

CRC System Requirements

First things first, you need a decent workstation to run OpenShift 4. The minimum requirements are 4 vCPUs, 8 GB RAM, and 35 GB disk space. And even with that, I constantly saw HyperKit (the VM backend CRC uses) consuming 100-200% CPU and 12+ GB of RAM (sheesh!).
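If you want to sanity-check your machine against those minimums before downloading anything, a quick script like the following can do it. This is a sketch that assumes a Linux host with GNU coreutils (`nproc`, `df --output`); on macOS you'd use `sysctl -n hw.ncpu` and `sysctl -n hw.memsize` instead.

```shell
# Pre-flight check against CRC's documented minimums:
# 4 vCPUs, 8 GB RAM, 35 GB disk.
cpus=$(nproc)
mem_kb=$(awk '/MemTotal/ {print $2}' /proc/meminfo)
disk_gb=$(df -BG --output=avail . | tail -1 | tr -dc '0-9')

[ "$cpus" -ge 4 ]         || echo "Warning: only $cpus vCPUs (CRC wants 4+)"
[ "$mem_kb" -ge 8388608 ] || echo "Warning: less than 8 GB RAM"
[ "$disk_gb" -ge 35 ]     || echo "Warning: less than 35 GB free disk"
echo "Found: $cpus CPUs, $((mem_kb / 1024 / 1024)) GB RAM, $disk_gb GB free"
```

Keep in mind these are just the minimums — as noted above, the VM happily eats far more RAM than 8 GB once a cluster is running.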

Installing CRC

Before you can use CRC, you must have a Red Hat Developer account. Sign up on the Red Hat Developers Registration site. You can then visit the CRC install page to view the CRC Getting Started Guide, download the platform-specific crc binary, and copy your 'pull secret' (which is required during setup).

After you download the crc binary and place it somewhere in your $PATH, you have to run the following commands:

$ crc setup
$ crc start

crc setup creates a ~/.crc directory, and crc start will prompt you for your pull secret (which you can copy from the bottom of the CRC install page).

Note that during crc setup, I had a big scary warning pop up on my Mac about the binary code signature being changed (this is from a malware detection system built into Little Snitch):

CRC has been modified terminal warning

I clicked 'Accept Modification', as the modification is part of CRC's install process... but hopefully that's something that could be avoided in future versions of CRC!

After a few minutes, crc start prints out some cluster information, like:

INFO Starting OpenShift cluster ... [waiting 3m]
INFO To access the cluster using 'oc', run 'eval $(crc oc-env) && oc login -u kubeadmin -p [redacted] https://api.crc.testing:6443'
INFO Access the OpenShift web-console here: https://console-openshift-console.apps-crc.testing
INFO Login to the console with user: kubeadmin, password: [redacted]
CodeReady Containers instance is running

If you open the web console (https://console-openshift-console.apps-crc.testing), you'll need to accept the self-signed certificate... then you'll be redirected to another local URL, and you'll need to accept another self-signed certificate. But once that's done you should arrive at a login screen:

CRC Login screen

Click on the kube:admin option, then log in using the credentials output by crc start. Note that if you get an error the first time you click on the kube:admin option, try again in a minute or two; the cluster still might be initializing.

Now you should arrive at the OpenShift dashboard:

CRC OpenShift dashboard project list

Running an application on OpenShift locally

One nice thing about OpenShift is that you can manage most everything via the UI, if you desire (assuming the user you are logged in as has the RBAC rights to do so); you can also inspect all the raw resources (either in OpenShift's layout or as YAML in an in-browser editor) pretty easily.

Since I fancy myself a PHP developer, I decided to deploy an example CakePHP application (never used CakePHP before, though):

  1. I created a project with name php-test and display name 'PHP Test'.
  2. On the Project Status page, I chose to browse the application catalog to find something to deploy.
  3. I scrolled down to PHP, clicked it, and chose 'Create Application'.
  4. I named it php-test, left all the defaults (except for checking the 'create route' checkbox to create a public URL), and clicked the 'Try Sample' button to test out the example repo (which happens to be a demo CakePHP app).
  5. I waited.

This application requires some time to pull the base image, run a container to pull the source, run another container to build the project (using PHP's package manager, Composer), and finally run the built container so the PHP test Pod can start responding to HTTP requests.

You can monitor the progress of the build by clicking on the DeploymentConfig (DC) 'php-test', and then inspecting the 'Resources':

CRC Openshift project application resource build progress monitoring

It took a few minutes for the build process to complete (there's a lot to download, apparently!), but once that was done, I could visit http://php-test-php-test.apps-crc.testing/ and see the running application:
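That URL isn't arbitrary: the generated route hostname appears to follow the pattern <route-name>-<project-name>.<cluster apps domain>, where the apps domain on CRC is apps-crc.testing. A tiny helper function (my own sketch, not part of CRC) makes the pattern explicit:

```shell
# Reconstruct the default route URL CRC generates for an app.
# Pattern (as observed above): <route-name>-<project>.<apps-domain>
route_url() {
  name=$1; project=$2; apps_domain=${3:-apps-crc.testing}
  echo "http://${name}-${project}.${apps_domain}/"
}

route_url php-test php-test
# → http://php-test-php-test.apps-crc.testing/
```

Since both the app and the project here are named php-test, you get the slightly redundant-looking php-test-php-test hostname.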

CRC OpenShift demo CakePHP application

It's interesting: I'd never seen 'Routes' in K8s nomenclature (I'm used to 'Ingress'), but they seem to be about the same thing, just with some extra decoration for OpenShift-specific features. I could also manually add a Kubernetes Ingress resource to direct a domain (e.g. http://php.test/) to the php-test service running in the php-test namespace.
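For reference, such an Ingress might look like the manifest below. The hostname (php.test) and service name (php-test) come from the walkthrough above; the service port (8080, the usual port for OpenShift's PHP S2I images) is my assumption. On this 4.1-era cluster the extensions/v1beta1 Ingress API was current; newer clusters use networking.k8s.io/v1 with a slightly different backend schema.

```shell
# Write out a minimal Ingress manifest for the php-test service.
# Port 8080 is an assumption (typical for OpenShift PHP S2I images).
cat > php-test-ingress.yaml <<'EOF'
apiVersion: extensions/v1beta1
kind: Ingress
metadata:
  name: php-test
  namespace: php-test
spec:
  rules:
    - host: php.test
      http:
        paths:
          - backend:
              serviceName: php-test
              servicePort: 8080
EOF
# Then apply it with: oc apply -f php-test-ingress.yaml
```

You'd also need php.test to resolve to the cluster (e.g. an /etc/hosts entry pointing at the CRC VM IP) for the domain to actually work from your workstation.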

Managing the cluster

After you're finished, you can run crc stop to stop the cluster, and crc delete to completely delete it. crc start then generates a new cluster (or restarts a stopped cluster).

Getting the CRC VM IP address

I wanted to get the IP address of the HyperKit VM, too, so I could access exposed services (either by NodePort or Ingress via DNS), and there are three quick ways to do that:

  1. In the OpenShift UI, go to Compute > Nodes, and look at the 'Internal IP' under 'Node Addresses'.
  2. Run crc ip.
  3. Ping the console URL (e.g. ping console-openshift-console.apps-crc.testing).


If you want to explore the RHCOS (Red Hat CoreOS) system that CRC's OpenShift 4.1.11 cluster runs on:

$ ssh -i ~/.crc/cache/crc_libvirt_4.1.11/id_rsa_crc core@
$ sudo -i

Thanks for this writeup.
I installed CRC in a CentOS VM on ESXi. Do you know how I can make a project accessible from outside of the VM?
I have seen these IPs:
  - CentOS VM IP
  - CodeReady Containers VM IP
  - 172.30.161.x (Cluster IP)
  - 10.128.1.x (Pod IPs)


If you add a NodePort, you should be able to access the service behind it at [NodePort-here]

How did you run crc setup on a Linux VM? I keep getting an error that the virtualization flag is not set in the BIOS.

You need to enable VT-x (virtualization) in the BIOS. If you are using a VM, then you need to enable nested virtualization; you can check how to enable it in the documentation of the virtualization software you are using. I have run into multiple errors when using nested virtualization, so I would suggest installing on a laptop instead.

Set up HAProxy and add a second entry to your dnsmasq.d that points to your host for the wildcard domains. That's what I'm doing.

Do you know whether it is possible to log in to the CRC VM? If it is possible, which credentials should be used for the login?

Today I tried to install CRC on my upgraded Mac running Catalina (10.15.1), and I'm running into a dialog that blocks me from running crc setup, which reads:

"crc" cannot be opened because the developer cannot be verified.

macOS cannot verify that this app is free from malware.

Safari downloaded this file today at 9:24 AM

I opened up the following issue in the CRC issue queue: Can't run crc setup on macOS Catalina - "the developer cannot be verified".

In the Finder, you must locate each binary that does this (I also saw this with some Kubernetes tools), right-click it, and choose Open; then wait for the first run of the binary to complete, and you will have authorized it to run. After that you can run crc setup.

Do you know how to allow the developer user to log in to the CRC console? After I install it, everything seems OK: kubeadmin can log in to the console, but the developer user with its default password "developer" cannot. In OCP 3.x, I don't remember my user being unable to log in to the console; there were just some restrictions shown if the user was not an admin in a project.

I have installed and setup CRC OpenShift 4 cluster on RHEL.

I can see 51 projects from the "oc projects" command.

I am unable to open the console "https://console-openshift-console.apps-crc.testing/".

I get the error "This site can’t be reached".

Please help.

