Understanding password requirements

The Cybersecurity and Infrastructure Security Agency (CISA) just published new password requirements, and they can be counterintuitive to say the least. Unfortunately, it doesn’t seem like CISA did a great job of explaining why things are changing and what the changes mean, so I’ll do that here.

For at least five years now, researchers have been showing that complexity and rotation requirements, meaning special characters like ^%$! and forced changes to a new password every 90 days (or 30, 60, 120), actually make passwords less secure. This is where it becomes counterintuitive, and where just saying we’re changing the password requirements based on new standards falls flat. You would assume that making passwords more complex and changing them often makes them more secure, and in a narrow technical sense you would be right. Taken in the overall picture, though, this breaks down. Special characters mean passwords are harder to come up with and harder to recall, and being forced to change them four times a year is a burden, especially when you consider the average user has to keep passwords for 30 different things between their work and personal life. (LOOK THIS UP!) So the average user, with their average pile of passwords, makes things as easy as possible to remember: using the same password everywhere they can, as often as possible, and changing only a character or two when forced. Most people in IT know that Spring2022! is pretty insecure, but it usually fits within password requirement constraints. Oh, and when it’s time to change it, we’ll just go ahead and make it Summer2022!. Problem solved.

So the likelihood of a breach rises, because your users are now reusing their Pinterest password, bank password, and work domain password. Cybersecurity does not exist in a bubble, and neither should the policies we live by. I’m sure from a technical standpoint password requirements can be created to exclude common words, dates, and whatever else we could dream up, but users will probably still write them down and lose them, email them to themselves, put them in a spreadsheet or Google Doc with all their other passwords, or whatever else makes things easier. They flow down the path of least resistance, and we need to give them one. In the case of passwords, the path of least resistance is longer but with fewer constraints: at least 12 characters, no requirement for special characters, and no reset unless there’s an actual need. Tell users to use a phrase that makes sense to them. I look around my office walls for words to make into phrases.
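To put rough numbers on “longer beats complex,” here’s a back-of-the-envelope keyspace comparison. The pool sizes are my own assumptions (94 printable ASCII characters, a 7,776-word Diceware-style list), and these figures assume random selection; predictable picks like Spring2022! are far weaker than the math suggests:

```shell
# Bits of entropy = length * log2(size of the pool each pick comes from)
awk 'BEGIN { printf "8-char complex password: %.1f bits\n", 8 * log(94) / log(2) }'   # 52.4
awk 'BEGIN { printf "4-word passphrase:       %.1f bits\n", 4 * log(7776) / log(2) }' # 51.7
awk 'BEGIN { printf "20 lowercase letters:    %.1f bits\n", 20 * log(26) / log(2) }'  # 94.0
```

The point isn’t the exact numbers; it’s that length does the heavy lifting, and a long phrase of ordinary words is both stronger and easier to remember than a short string of line noise.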

Fiscally speaking, you aren’t going to have as many calls coming into your IT help desk, or to your lone IT person, for help with a password. If you can cut down on that workload, you can keep from having to add staff, or concentrate on projects and priorities that matter more to the organization than helping someone reset a password four times because they can’t recall it or wrote it down incorrectly (once again, writing it down!).

Your coworkers are not your customers.

Perhaps some of the problems between IT and the rest of a company are caused by treating non-IT staff as customers, as if the IT department were not just a separate function in the organization, but a separate organization itself. I’ve been part of internal IT departments that call other employees in the same company “customers,” and I think it’s probably a net negative for all involved. Those of us in the IT department don’t feel part of the company’s mission, or even that we have to be part of it. Our colleagues are no longer people we can relate to, but a number on a scorecard or just the enemy of our systems. On the other side, the rest of the company gets a pass on treating IT staff like fellow human beings, which is a bit of an exaggeration, and I suppose it doesn’t really stop someone who would be an asshole anyway. For end-users, though, it does get frustrating. They don’t think IT understands the business or their priorities, and they are right.

I get the motivation to treat end-users as customers. First off, “end-users” as a term doesn’t sound that great. Mostly, though, it’s that IT leadership wants the best possible service for the company, and calling people customers seems like a lazy way to say that’s happening. If they’re customers, and the customer is always right, then the service will be great! Not so much.

Ultimately, treating your fellow employees like customers keeps personal relationships from forming. No one sees the IT department as part of the company, so there isn’t any engagement. IT sees itself as a punching bag, and the other departments see IT as a maker of rules and a holder-up of progress. If the IT department can start seeing itself as part of the company, and the rest of the company does the same, we can get back to using technology to solve business problems instead of just being one.

History of my home lab

I’m putting together this history for two reasons: 1) I’m giving a talk on Wednesday, and 2) it’s fun.

My home lab traces back to my first MacBook in 2008. I wanted to run Windows, for reasons I can’t recall but that were probably just “because I can’t.” A friend down the hall in my dorm did IT support for the college and gave me a key for VMware Fusion. I never got Windows installed, but I did get Ubuntu and Fedora running. I was slightly familiar with GNU/Linux before and had even tinkered a little, but having a virtual machine on my laptop was next level.

Fast forward almost ten years, and I’m working in technical support, wanting to push my skills past the help desk into the world of cybersecurity. Someone gave me a list of tools to learn and off I went. I spun up virtual machines on my gaming desktop to learn the tools first and the fundamentals after. Slightly backward, but hey, I’m getting there. Someday I’ll figure out how these computer things really work.

Today that gaming PC is just a sticker-laden shell of what it was. The case is there, but that may be it. It’s now a Frankenstein of server parts: a 16-core AMD Opteron server processor, 32GB of ECC RAM, a hodgepodge of hard drives, a cool server motherboard with IPMI, and four NICs. That’s just the big server. I also have some Raspberry Pis, an Nvidia Jetson Nano board for AI development, laptops, small computers, and a full stack of Cisco Meraki networking gear.

What’s it doing lately? I host Plex on my NAS, because I constantly blow away my hypervisor for one reason or another. The biggest benefit is that the NAS draws less power, I assume. Plus, it’s always going to be on anyway. The big server is mostly used for testing these days. I’m running NextCloud on a small computer with an Ubuntu server image, and another small computer is hosting the Security Onion stack as a SIEM.

Home lab resources

I’m giving a presentation on getting your feet wet with home labs, so I put together a list of resources. Feel free to add to it!

Reddit.com/r/homelab is a great place for help, reassurance, community, and pretty pictures.

Check out this “new to r/homelab? start here” post: https://www.reddit.com/r/homelab/comments/5gz4yp/stumbled_into_rhomelab_start_here/

They also have a wiki: https://www.reddit.com/r/homelab/wiki/index

https://www.reddit.com/r/selfhosted/ is also a good place to get inspiration and ideas.

Hypervisors (this is where your lab systems go to live and you go to play!)

VirtualBox. Great for your laptop or desktop, and you can easily spin things up.

https://www.virtualbox.org/

VMware. The de facto standard I’ve seen in business is vSphere ESXi. They also have workstation products: Fusion for Mac and Workstation for PC. Some items are free, and ESXi is a solid type 1 hypervisor for home use. I’d start with this for a dedicated box!

https://docs.vmware.com/en/VMware-vSphere/7.0/com.vmware.esxi.install.doc/GUID-016E39C1-E8DB-486A-A235-55CAB242C351.html

Proxmox. An open-source hypervisor built on Debian (Ubuntu’s parent). It adds a management layer on top of Linux KVM. A solid choice for a home lab and really popular with the r/homelab folks. I’m currently using this.

Straight KVM. Cowboy up!
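If you do cowboy up, a guest can be created with a single virt-install command. This is only a sketch: the VM name, memory and disk sizes, ISO path, and os-variant below are placeholders to adjust for your own box.

```shell
# Assumes qemu-kvm, libvirt, and virtinst are already installed
# and your user is in the libvirt group.
virt-install \
  --name lab-ubuntu \
  --memory 2048 --vcpus 2 \
  --disk size=20 \
  --cdrom ~/isos/ubuntu-20.04-live-server-amd64.iso \
  --os-variant ubuntu20.04
```

From there, virsh (list, start, shutdown, console) is how you manage the guest day to day.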

System and software images:

Windows 10 dev environment, which is great for testing and playing around with things. 90-day license.

https://developer.microsoft.com/en-us/microsoft-edge/tools/vms/

Windows servers. Server 2019 trial. Hyper-V is free for perpetual use.

https://www.microsoft.com/en-US/evalcenter/evaluate-windows-server-2019?filetype=ISO

Want to hop Linux distros and play with either the most popular or strangest variants of GNU/Linux?

https://distrowatch.com/

Ubuntu

I recommend the most current LTS (long-term support) version.

https://ubuntu.com/download/server

https://ubuntu.com/download/desktop

I would go with Ubuntu if you want to learn or play with Linux. You’ll be told there are easier distros to set up, and there are, but nothing is as popular, which means you can always find an answer. You’ll be compiling your own kernel and pointing out that distributions are for amateurs in no time. (No time being a relative and subjective term; my no time = 10 years.)

CentOS is a good alternative for servers. It’s the free downstream rebuild of Red Hat Enterprise Linux, which has been the default enterprise Linux I’ve seen in the United States. It really does not matter; I thought at some point I should learn CentOS instead of Ubuntu or whatever else, but under the hood it’s about all the same.

Networking:

Whatever you’ve got! Really, do not worry too much about networking for now.

Ubiquiti gear is great if you want to spend some money. It works well for home and business, with lots of dashboards, and it’s easy to use.

pfSense. This is for when you want to start getting into the weeds. It’s easy to set and forget, but if you want to start tinkering you can go all out.

Cisco? Chances are high the company you work for uses Cisco. You can usually get gear super cheap for your lab on Craigslist or eBay. Anything less than 10 years old should be okay.

Buying stuff

eBay.com

Craigslist.org

Facebook Marketplace

Goodwill

Tech recycling places (in GR we have Comprenew and it’s like a nerd vacation for me every time I go)

https://www.reddit.com/r/hardwareswap/

Watch out for things that may require a license. A lot of gear works without licensing, but all the fancy bells and whistles get turned off.

Internet In A Box

One of my big projects for the year is to create what I’m calling Internet In A Box. It’s a self-contained device able to provide social networking and collaboration to small groups. It’s intended for use by protesters in the event their government turns off internet and cellular service.
Currently, the project is shifting from planning to development. It consists of a Raspberry Pi running Debian, serving NextCloud’s collaboration suite via Apache and MariaDB alongside the Dolphin open-source social networking platform. Load testing on the Raspberry Pi is proving to be the toughest hurdle so far; I will either change hardware (wireless, at least) or attempt changes in NextCloud. There is also a NextCloud image specifically for Raspberry Pis that can be tested.
Down the line, I would like to examine distributed storage and compute, but for now the focus is on the core product: an isolated sharing platform.
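For anyone curious, the NextCloud side of the stack boils down to something like the following on a stock Debian image. This is a sketch, not the exact build: package names vary by Debian release, and the database name, user, and password here are placeholders.

```shell
# Apache, MariaDB, and the PHP modules NextCloud needs
sudo apt update
sudo apt install -y apache2 mariadb-server php php-mysql \
  php-xml php-zip php-gd php-curl php-mbstring

# A database for NextCloud (names and password are placeholders)
sudo mysql -e "CREATE DATABASE nextcloud; \
  CREATE USER 'nc'@'localhost' IDENTIFIED BY 'change-me'; \
  GRANT ALL PRIVILEGES ON nextcloud.* TO 'nc'@'localhost';"

# Unpack NextCloud into the web root and hand it to Apache
curl -LO https://download.nextcloud.com/server/releases/latest.tar.bz2
sudo tar -xjf latest.tar.bz2 -C /var/www/html/
sudo chown -R www-data:www-data /var/www/html/nextcloud
```

From there, NextCloud’s browser-based installer finishes the configuration.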

What is Zero Trust?

I’m pretty sure I know what zero trust is, since I use it. Can I define zero trust, though? We’ll find out.

You’ve probably heard at least one person in IT grumble, “Never trust, always verify.” That’s about the bulk of zero trust. There’s a whole NIST publication on it (NIST SP 800-207, Zero Trust Architecture), which is what I slogged through, but really you just need to remember what that person grumbled in a meeting about cybersecurity three years ago. What the saying means is that everything on the network is verified for identity and context every time it tries to access a resource.

To really get it, I had to take a little trip down memory lane to what was apparently only the year 2018, when there was no zero trust and everything was implicitly trusted (not true, but it’s my story, so stay with me). AdminJane logged into the VPN while on vacation in Eastern Europe so she could do some server updating. After logging in through SSH with her admin username and a password, she could update that fancy web server and then jump on over to the print server, no questions asked. And apparently, that’s how things were back in the day: once you got through the firewall, everything on the network just assumed you were good to go. But what if AdminJane isn’t really on vacation, and all of this happens at 3 a.m.? Probably not cool, but once you were in, you were in, nefarious or not. Was it really like this? Ehh, not exactly. But close; and for the most part, it still is pretty close today.

Okay, so back in the day you got through the perimeter and you were into everything. With zero trust, though, all that communication between enterprise resources is checked and double-checked and, of course, encrypted. Not only does the system ask AdminJane for the correct password (and, hopefully by now, a second factor), it also uses context. That fancy $15k next-gen firewall is making some choices now, including ones based on location and time. Ever get an alert that someone tried to use your credit card number, caught because you used it in Kalamazoo, Michigan and then someone tried to use it in Sacramento, California three minutes later? Your firewall is doing something like that. Once you get in, the verifying continues. Various parts of your security infrastructure work together to stay vigilant: identity and access management makes sure the right resources go to the right person when they need them, the firewall makes sure that person isn’t using ports they don’t need at times and places they shouldn’t, and the SIEM is pulling in all the logs and tracking it. On and on, in what is now called the software-defined perimeter.

The one thing you should keep in mind is that zero trust isn’t a set of tools. Zero trust is a process and a culture. It’s the idea that something shouldn’t be inherently trusted just because it made it through the front door, and everything you do to implement it should be based on that idea.

Extra Life Game Day Roundup

Extra Life game day has come and gone. Every year on the second Saturday of November, people go full telethon for 24 hours of live gaming. Usually I spend it at the hospital, raising money by dungeon mastering some D&D games throughout the day. This year, of course, was a little different, and I finally jumped on the streaming bandwagon. It was much less of a shitshow than I assumed and was a lot of fun. I played some Darkest Dungeon on the PC to start the day and ended with a live reading of some H.P. Lovecraft for a worldwide Twitch audience of two. All in all, I had a good time, and people were kind enough to open their pocketbooks and donate to a good cause.

You can still donate at https://www.extra-life.org/participant/BenStitt and while there you can see some clips of my reading and playing.

Linux Gaming

I’m trying, I promise!

Really, I want it badly. I’ve wanted to go full-time desktop Linux for 12 years now, but gaming is so hard! I love tinkering and prefer it to gaming, so why is this so tough?

I’m currently running Fedora 32 on my gaming PC, but dual-booting Windows for just about anything that isn’t from Steam or GOG. I dig my CoD, and I’m sure someone out there gets it working, but that someone is not me. Anything with anti-cheat tech just doesn’t want to play at all.

I also tried GamerOS on a little mini PC I had lying around, figuring I’d stream the heck out of Steam games since my wifi is so good. Nope! I think that might be a hardware thing, though, because it runs everything like garbage.

One day! One day soon, I hope!

Managing Windows applications with Chocolatey

I’m a big fan of package managers, and if you’ve ever used Linux or even an app store, you probably are too. Chocolatey is that solution for Windows. It’s simple to install at the command line ( https://chocolatey.org/install ) and can be used as a command (choco install adobereader googlechrome), in a PowerShell script, in an initial image config file, or as part of your CI/CD pipeline. I’m constantly blowing away my workstations and currently use it in an initial workstation setup script I pull from my GitHub. As you can see from my script at https://github.com/GalacticDeep/workstationsetup, it’s incredibly simple. There’s a lot more that can be done to set up workstations or application servers with Chocolatey, and it’s on my project list to use it in my pipeline with Ansible to prevent package configuration drift on Windows servers. It’s also on the to-do list to set up the Windows end of the imaging server to use Chocolatey to more easily update applications on the golden Windows 10 and Windows Server images.
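For a flavor of what such a setup script looks like, here’s a minimal sketch (not my actual script; the package names are examples, and anything you install should be checked against the Chocolatey community repository first):

```shell
# Run from an elevated prompt after installing Chocolatey.
# -y answers all confirmation prompts automatically.
choco install -y googlechrome 7zip vscode git

# Later, one command upgrades everything Chocolatey manages:
choco upgrade -y all
```

Rerunning the script is safe; Chocolatey skips packages that are already installed at the current version, which is what makes it handy for rebuilds.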