We needed to expand our home network and place another switch in the house. Normally I would run a CAT-6 cable and call it a job done, but I thought I would try something different and run a fibre optic cable.
My existing UniFi Switch 24 POE-250W already has 2 SFP ports, which support fibre if I purchase the appropriate transceivers. I decided to buy the Unifi 1 Gbps multi-mode SFP modules (Ubiquiti U Fiber Multi-Mode SFP 1G – UF-MM-1G) just to make sure there were no compatibility issues. There was no point in getting a pair of 10 Gbps transceivers because the switch's SFP ports only handle 1 Gbps anyway.
Since I only have one switch with SFP ports, I needed another device that accepts an SFP module and bridges it to a standard RJ45 connection. I found a nice little converter, the TP-Link MC220L Gigabit Media Converter. This worked perfectly.
The only confusing part was the different types of fibre connectors that are out there, and whether to go with single-mode or multi-mode. Apparently most installations use the LC (Lucent Connector), which is the type that I went with. There was no need for me to go with single-mode because I don't need kilometres of cable.
Everything worked like a charm. It was much easier than I thought. So step into the light and give fibre a try!
I have two boys enrolled in the York Region District School Board (YRDSB). Both are in high school: Kalen is in grade 9 and Jason is in grade 11. Like their peers, they had to adapt their learning habits to the new age of the Covid-19 pandemic. Both have recorded their experience during this crisis, and you can read their perspectives:
As an observer of their new habits during the pandemic I noticed the following things:
We had to impose a strict schedule that mirrors a regular school day. For example, they have to wake up no later than 9am and must conduct their studies from 9am to 3pm. They have a lunch break which lasts between 30 minutes and an hour, and they can use their own discretion to take 15 minute breaks throughout this period. However, if we notice the breaks being abused, they are persuaded to continue with their assigned curriculum. This was followed for about 6 weeks, but since Ontario decided to cancel the remaining school year, the start time of this schedule has slowly crept to 10am instead of 9am. Without this imposed discipline, they would sleep in until the afternoon.
In subject areas where they are challenged and find the online learning medium insufficient for their needs, we hired tutors from Superprof.ca. Even though the tutoring sessions are still remote, the technology employed by the tutors offers more one-on-one, real-time access to the material and to help. In contrast, almost all the remote learning from YRDSB is based on distributing material. Students are expected to check for online updates and materials, and to follow reference links to other self-learning resources such as PowerPoint slides and PDF documents. Assistance can be obtained through commenting systems or online forums. Although both of my sons are dealing with the situation, I think for most students this is simply woefully inadequate. A live video conferencing medium would, I think, go a long way here.
The technology employed is under-utilized or insufficient, and on the whole subpar compared to the contemporary online tools that today's businesses use to support telecommuting. Most teachers are simply inexperienced in managing a remote group atmosphere. Students may get the impression that the teachers themselves are being cavalier, so they adopt the attitude of, “Why should I care?”
Group cohesion that is typically experienced within a classroom has disappeared entirely, because no one has access to standardized technology for getting together live.
The learning motivation has disappeared, since most students feel a lack of recognition for the work they do put in. Positive reinforcement is hard to convey when it is not live.
The Covid-19 situation caught all of us off guard, and many of us found ourselves unprepared for the crisis. Therefore, it is understandable that our education system falls short in trying to deliver the same level of education to students in a remote setting. In hindsight, it was a good effort, but the goal was simply too ambitious, with not enough resources, training, and support to achieve it.
I am not complaining, but simply taking this opportunity to note the observations experienced by both Jason and Kalen. I hope that by articulating our experiences here, we can help the movers and shakers at YRDSB formulate an enhanced strategy for the Fall of 2020, as I fear the current situation will persist until a vaccine is widely available.
Over the past few months, I have been peppering the house with Unifi G3 Flex security cameras. They were very easy to install, and since they use PoE, they minimize the wiring as well.
To monitor the cameras, I installed the Unifi Video server software on my Ubuntu Server, which is up 24×7. All of these hardware components are behind my Unifi Security Gateway firewall.
For whatever reason, I got to wondering whether Apple HomeKit could talk to these cameras. With a little Google searching, I came across this article. The instructions in the article were a bit outdated, but what piqued my interest was the use of the homebridge server. I already had the homebridge service installed on my Ubuntu server, because I use the same technique to have HomeKit talk to my homemade garage door opener. After looking through the instructions, it looked like all I needed to do was:
Configure Unifi Video to enable RTSP streaming for each of the cameras that it manages;
I had four cameras installed, so you will see four cameras configured within the Camera-ffmpeg platform. Note that the entire platforms section should be a sibling of the accessories and bridge sections, as in the sketch below.
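Here is a trimmed sketch of the platforms portion of my config.json (the existing bridge and accessories sections stay as they are; only two of the four cameras are shown, and the IP address and RTSP stream tokens are placeholders — Unifi Video displays the exact RTSP URL when you enable the stream for a camera):

"platforms": [
    {
        "platform": "Camera-ffmpeg",
        "cameras": [
            {
                "name": "Dining Room",
                "videoConfig": {
                    "source": "-rtsp_transport tcp -re -i rtsp://192.168.1.10:7447/STREAM_TOKEN_1",
                    "maxStreams": 2,
                    "maxWidth": 1280,
                    "maxHeight": 720,
                    "maxFPS": 15
                }
            },
            {
                "name": "Great Room",
                "videoConfig": {
                    "source": "-rtsp_transport tcp -re -i rtsp://192.168.1.10:7447/STREAM_TOKEN_2",
                    "maxStreams": 2,
                    "maxWidth": 1280,
                    "maxHeight": 720,
                    "maxFPS": 15
                }
            }
        ]
    }
]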
I restarted the homebridge service with systemctl and saw messages similar to the following:
May 14 15:00:05 avs homebridge[2973]: [2020-5-14 15:00:05] Please add [Dining Room] manually in Home app. Setup Code: 031-45-153
May 14 15:00:05 avs homebridge[2973]: [2020-5-14 15:00:05] Please add [Great Room] manually in Home app. Setup Code: 031-45-153
May 14 15:00:05 avs homebridge[2973]: [2020-5-14 15:00:05] Please add [Kitchen] manually in Home app. Setup Code: 031-45-153
May 14 15:00:05 avs homebridge[2973]: [2020-5-14 15:00:05] Please add [Garage] manually in Home app. Setup Code: 031-45-153
You simply open the Home App and add the above camera accessories using the manual process instead of the scan code method.
If everything works, you should see something like this on the Home App. Below is a screen shot taken from the macOS version of the Home App.
It is pretty cool to have something working in your home for several months and then, because of a passing thought, gain a new capability in the house via Apple's HomeKit Home App.
Today I went about performing my regular server maintenance update and noticed an IP address, 202.107.188.12, hitting my Apache web server. Out of pure curiosity I thought I would look up this IP address. After a quick whois command, I found out it came from Xinjiang Province, China, specifically from the http://www.chinanet-online.com ISP.
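For reference, the lookup is just the stock whois client; the grep filter here is simply my own convenience for pulling out the interesting fields:

# Query the regional registry for ownership details of the address
whois 202.107.188.12 | grep -iE 'netname|descr|country|e-mail'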
After more investigation, I noticed that this particular visitor was attempting a ThinkPHP remote code execution attack. I don’t run the ThinkPHP framework. Below is the log recording its attempt.
I have always wondered how many bad actors are out there and whether they would end up attacking my stuff. Well, now I know; I have experienced it first hand.
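If you want to check your own Apache access log for the same kind of probe, a grep along these lines works; the log path below is the Ubuntu default and the match strings are the ones commonly reported for this ThinkPHP exploit, so adjust both to your setup:

# Look for the function-invocation strings typical of ThinkPHP RCE probes
grep -iE 'invokefunction|call_user_func_array' /var/log/apache2/access.log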
They are now part of our lives. Like shoes, we now have to don one of these before we head out. I don’t disagree with this practice, but it certainly fits the bill of a “new” normal.
We have started to wear masks when going outside, primarily for grocery shopping. My immediate impression came from the act of putting one on. It felt restrictive right away. It is not the same as wearing a face-covering toque while skiing or going out on a windy winter day. Those are much more breathable than a mask that is supposed to filter microscopic droplets of fluid. Your face immediately gets uncomfortably warm as you recycle more of your own breath.
If put on loosely, the warm exhalation escapes and fogs up your glasses. It took me a couple of days of experimenting to find the best way to wear one so as to minimize this effect. I have gotten it to be bearable, but still not ideal.
When I check out of the store, simple acts like a smile when saying thank you can no longer be expressed fully. The concealment of such a natural expression felt like I was deceiving the person I was communicating with. A thank you without a smile seemed incomplete. I wonder how many misunderstandings this will cause as our face to face interactions become more faceless. Will we be angrier and less respectful of others, as we are towards other drivers when we are in our cars? Will this make us a less friendly society with less consideration for others? Will this make us less human to others? I certainly hope not, but one has to wonder.
The physical act of checking out made me suddenly discover that I can no longer use Face ID to authenticate Apple Pay. Luckily I still have my Apple Watch on hand. I know, first world problems, but it really makes you realize how often you check your phone, from looking up contact information to monitoring simple notifications.
Aside from my unhelpful whining about sporting one of these and looking like a typical Mortal Kombat character, I do support and believe that wearing one during the Covid-19 pandemic helps to limit the spread of the virus. However, I can't help but wonder how society will change with this simple facial garment.
Perhaps this is all nothing as we all adjust to this new normal and it is no different than interacting with everyone on a cold blizzard day when everyone’s faces are concealed. However, something tells me that this will be more impactful. How do you feel about wearing a mask?
This past weekend, I was writing a simple HTML5 utility that serves certain videos from my own personal library. The idea is that I can make a limited selection of videos and present it on a web page, so that the user can simply click on the cover art of a video and it plays inline on the web page.
I thought it was a pretty simple requirement that I should be able to whip up in a few minutes using the HTML5 <video> tag. Everything worked on my Mac and my iPad. It even worked on my LG OLED TV. However, when it came to the iPhone, mobile Safari would load the movie but would not play it. I was even able to seek through the movie by scrubbing the scroll bar, but when I pressed the play/pause button, nothing would happen.
After many hours scouring the web, I found out there are many caveats when serving videos to the iPhone on iOS.
Videos behind an authenticated location may not work on an iPhone, because when Safari hands the video off to the player, the player does not inherit the previously authorized identity. To get around this, I had to create a token-based technique, where the main page passes a token to a PHP page that checks the token and serves the video content.
The PHP used to serve the video also needs to handle HTTP Range requests. This wonderful contribution from GitHub really helped me out!
Multichannel audio such as 5.1 encoded in AAC will load but not play on my iPhone XS currently running iOS 13. The video will play if I re-encode the audio to either AAC stereo or 5.1 multichannel in AC3 (a sample re-encode command is sketched after this list).
Multiple audio streams also did not work. The iOS player was only happy with a single compatible audio stream.
If you want the video to autoplay while inlined within the page, it must first be muted.
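For the record, here is the kind of ffmpeg invocation that covers the audio caveats above; the file names are placeholders, and the -map options keep the video plus a single audio stream, which also takes care of the multiple-stream issue:

# Keep the video untouched, transcode only the first audio stream to 5.1 AC3
ffmpeg -i input.mp4 -map 0:v -map 0:a:0 -c:v copy -c:a ac3 -b:a 448k output.mp4

# Or downmix that stream to stereo AAC instead
ffmpeg -i input.mp4 -map 0:v -map 0:a:0 -c:v copy -c:a aac -ac 2 output.mp4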
Lots of things to consider here. I lost about a day and a half researching and experimenting with this, so it is all recorded here just in case I forget in the future. I also hope this information will help you out as well.
We had a situation. Our original doorbell button, which came with the house in 1999 (more than 20 years old), decided to crack and disintegrate on us last summer.
We used some transparent packaging tape to salvage the button, but last month it, too, had finally had enough of the weather.
Once again our 3D printer came to the rescue. First I designed a replacement button in Autodesk Fusion 360, after meticulously taking all the measurements at least three times. Since it was a very small part, after around 20 minutes of printing I had the replacement ready to go. Here is the final part installed:
I was on the receiving end of some ridicule when I first purchased the 3D printer, but it certainly has come in quite handy.
I had an opportunity recently to install Ubuntu Server on a very old server, a Dell R710 that had 4 native network interfaces and 4 add-on network interfaces, resulting in a total of 8 network interfaces.
During the installation process, the installer did recognize all the physical network interfaces on the machine, but because it could not acquire DHCP addresses, I was forced to install Ubuntu without networking.
After the installation, only the loopback (lo) interface existed and all the other physical interfaces were missing. I had to use netplan to create the interfaces. This article was of tremendous help, and I pretty well just followed its instructions.
I first created the 99-disable-network-config.cfg file with the contents as instructed by the article.
sudo su -
echo "network: {config: disabled}" >> /etc/cloud/cloud.cfg.d/99-disable-network-config.cfg
Followed by editing the 50-cloud-init.yaml file with the following contents:
vim /etc/netplan/50-cloud-init.yaml
# This file is generated from information provided by
# the datasource. Changes to it will not persist across an instance.
# To disable cloud-init's network configuration capabilities, write a file
# /etc/cloud/cloud.cfg.d/99-disable-network-config.cfg with the following:
# network: {config: disabled}
network:
    version: 2
    renderer: networkd
    ethernets:
        eno1:
            dhcp4: true
Once netplan was configured, I then executed the following commands:
netplan generate
netplan apply
Once I rebooted the computer, the eno1 network interface existed with an IP provisioned by my local DHCP server.
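To double-check that the interface is up and has its address, the standard iproute2 and systemd-networkd status commands are handy:

# Brief view of the interface state and addresses
ip -br addr show eno1
# More detail from systemd-networkd, including the DHCP lease
networkctl status eno1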
Much of the content in this post is reproduced verbatim from the article, but I have recorded it here so that I can find it more easily if I ever need it in the future.
All of this started with one of my neighbours, whose laptop broke down. The laptop stopped recognizing its internal SATA connection, so it would not boot. My neighbour ended up booting Windows from an external SSD using a Windows To Go solution to continue using his laptop.
This somehow got me thinking about whether it is possible to boot Windows from an external SSD on a Mac. I knew Bootcamp allows you to create a dual boot scenario on the Mac, but the default procedure requires you to repartition your internal drive to do so.
With external SSD drives coming down in price (for example, you can now get a 500GB Samsung T5 for less than $130 CAD), it seems a pretty sweet deal to have Windows on the side with your MacBook.
After doing some research, it seems like others have similar ideas. I am not going to detail all the steps, since you can find YouTube videos and other forums that have already done the deed. Instead, the high level process goes something like this:
Use the Bootcamp Assistant App on the Mac to collect all the drivers on a USB stick or a local folder on your Mac. Do not use the wizard. You will need to use the Action menu. See Figure 1 below.
Download a Windows ISO and use a Virtual Machine (e.g. Parallels, VirtualBox, etc.) to install the Windows ISO onto an external SSD drive. I first tried VirtualBox but ran into Catalina permission issues that I could not circumvent. I ended up doing it with Parallels which I will go into details later.
Copy the drivers from the USB stick created in step 1 onto the desktop of the freshly installed Windows on the SSD drive.
Reboot your Mac, hold the Option key down before the Apple logo shows, and boot into the EFI boot option that contains Windows.
Make sure you have an external keyboard and mouse handy because the default Windows install may not recognize the native hardware yet. On my MacBook Air, I had no issues.
Once Windows comes up, log in and run the Bootcamp setup from the desktop, which was originally copied from the USB stick.
Once this is all done, you can dual boot into Windows on the Mac as long as you have that SSD drive handy.
So far everything works, and it is happily installing Visual Studio 2019. I even tried Cortana and the mic and speakers are working well. I did a quick Skype test call and the webcam is working well too.
I do want to document the steps that I performed with Parallels when installing Windows 10 onto the SSD. Those steps were not intuitive.
After this, stop the virtual machine and make the following custom configurations:
Start the Virtual Machine and it will go through the first part of the Windows installation. Once that is completed, it will reboot. Instead of booting from the external media, it will boot from the CD ISO image again. Simply shut down the VM and change the boot order again.
Once Windows 10 completes its installation, it will go through a user account setup process. If you are connected to the Internet during this stage, Windows 10 will force you to either use an existing Microsoft account or create one. This is unfortunate, but go ahead and create a temporary one. Remember to create a local administrator account and remove this temporary Microsoft account as the final step of the Windows setup.
Remember to copy the Bootcamp drivers from the USB stick to the Windows desktop before completing and shutting down the virtual machine.
Now you are ready to restart the Mac and dual boot into the external drive by holding the Option key while the machine restarts. The final step is to run the Bootcamp Setup.exe program, which should be located inside the Bootcamp folder that you previously copied to the desktop. This is the last step of the Windows configuration on the SSD drive, and you can restart your Mac and dual boot into Windows one final time.
You are now running Windows natively on the Mac's metal, without any emulators or virtual machines. This process is great for revitalizing old MacBooks lying around, especially for students who need a Windows computer for their curriculum but still want to retain their macOS. For more contemporary Macs, the small form factor and the speed of the Samsung T5 drive make it a great fit for this type of situation. This is very cool!
Update: Potential Trouble with Major Windows Update
I have been told that a major Windows update can encounter an error, and a registry change is required to fix it. The following page has more information on this. In summary, you have to change the PortableOperatingSystem registry value from 1 to 0, found at the registry location HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control. Thanks to Martin Little for this very helpful information.
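From an elevated Command Prompt inside Windows, the change can be made with a one-liner like this; it is simply my equivalent of editing the value in regedit, so treat it as a sketch:

rem Flip the Windows To Go flag so major feature updates are allowed
reg add "HKLM\SYSTEM\CurrentControlSet\Control" /v PortableOperatingSystem /t REG_DWORD /d 0 /f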
Update: Macs with Secure Boot using the T2 Chip
To allow a Mac with the T2 chip to boot from an external drive, certain settings have to be made with the Startup Security Utility. This utility can be accessed from the Mac's recovery mode, under the Utilities menu. You want to disable secure boot and allow booting from external media. Since secure boot is disabled, set a firmware password to prevent a bad actor from booting their own operating system with their own live USB key.
The goal is to create a USB key that contains a Linux based operating system. Any Linux compatible computer can then be booted with this USB key, temporarily borrowing the host computer. The hosted Linux OS can then access an encrypted partition that houses important private information that may be helpful in an emergency. This technique offers the maximum portability of accessible, private information such as your will, financial data, credentials, etc.
I previously had a USB key formatted with an encrypted Mac filesystem storing the same information. However, this was inconvenient because you would need to find a Mac in an emergency situation.
In the Linux community, you can create a Live USB key. The concept is an operating system that runs off the USB key on any computer you can plug the key into. However, many of these Live USB distributions do not remember any changes that you make while using the operating system. The next time you boot from the Live key, all your previous changes are gone and the Linux environment reverts back to its original, pristine state. To remember the changes between uses, these changes have to be “persisted”.
I set out to find the best method for creating a Live Linux USB key that operates with an encrypted persistent partition.
All the commands in this article have been performed within an Ubuntu 18.04 LTS Desktop install. I installed this version on both VirtualBox and Parallels on the Mac. Both worked beautifully, but Parallels has smoother integration with the Mac.
I searched for an alternative USB stick and settled on the SanDisk 64GB Ultra Fit USB 3.1 Flash Drive. This new USB stick's write performance was 4x faster than the Kingston's.
After learning more about initramfs hooks and boot loaders, and getting a refresher on the UEFI and BIOS booting processes and partition layout strategies for USB storage devices, I decided to roll my own Live USB using Ubuntu Desktop as a base, along with the mkusb tool for the initial layout. The reason for the change is that I already have Ubuntu elsewhere in the house, so standardization is probably a better bet.
To improve performance further, I decided that it is not necessary to encrypt the persistent partition where the system configuration updates will be stored. Instead, I will create my own private encrypted partition to store only the private data that requires protection. Article 1 also provided details on how to use the LUKS technology to encrypt any Linux partition, so my exercise with Kali Linux was not a total waste of time.
Before I run mkusb, I needed to install it first by doing the following:
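From memory, this was the standard mkusb PPA route; check the current mkusb documentation in case the package names have changed since:

# mkusb lives in its own PPA and pulls dependencies from universe
sudo add-apt-repository universe
sudo add-apt-repository ppa:mkusb/ppa
sudo apt-get update
sudo apt-get install mkusb mkusb-nox usb-pack-efi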
I ran the mkusb tool (after sudo su -)[1], with the following options:
We also chose an msdos partition table so that more computers will be able to boot from the key. Once mkusb completed, we needed to perform some custom partition layout. We used the gparted program for this purpose, so that the completed partition layout looks something like this:
We first deleted the original usbdata partition and grew the extended partition (/dev/sdb2) to about 18 GB. Approximately 6 GB of that is for casper-rw, where the system will store any custom configurations or upgrades made after this Live USB key is created. We then created another logical partition called Personal, around 12 GB in size, which will be encrypted; this is where we will store private, sensitive data for emergency use.
The remaining space will be allocated to USBDATA, a last primary partition for normal USB data sharing, the typical use case for a USB stick. We also want to make sure that the other FAT32 (usbboot) partition is not visible in Windows by setting the hidden partition flag. We did that with the gparted program as well.
Once the partition table is completed, we can now encrypt the Personal (/dev/sdb6) partition. For this, we went back to Article 1, which gave us the following instructions.
~# cryptsetup --verbose --verify-passphrase luksFormat /dev/sdb6
WARNING!
This will overwrite data on /dev/sdb6 irrevocably.
Are you sure? (Type uppercase yes): YES
Enter passphrase for /dev/sdb6:
Verify passphrase:
Key slot 0 created.
Command successful.
~# cryptsetup luksOpen /dev/sdb6 myusb
Enter passphrase for /dev/sdb6:
~# mkfs.ext4 -L Personal /dev/mapper/myusb
~# cryptsetup luksClose /dev/mapper/myusb
All Done! Now we have a bootable USB stick that can be booted from any Ubuntu compatible computer. I can store my own personal data in a very safe and private way within the encrypted Personal partition, while any changes I make to the system will be preserved between uses of the USB stick. On top of it all, the USB stick still has 40+ GB (~37.5 GiB) of storage for normal USB transfer usage.
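Day to day, the Personal partition only needs to be unlocked while it is actually being read or updated. From a terminal it looks roughly like this; the device name depends on how the key enumerates on whatever machine you have booted, and the Ubuntu desktop will also prompt for the passphrase if you simply click the partition in the Files app:

# Unlock the LUKS partition and mount it (device name is an example)
sudo cryptsetup luksOpen /dev/sdb6 personal
sudo mkdir -p /mnt/personal
sudo mount /dev/mapper/personal /mnt/personal

# ... read or update the private files under /mnt/personal ...

# Unmount and lock it back up when done
sudo umount /mnt/personal
sudo cryptsetup luksClose personal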
I spent some time copying some confidential information that I think I will need in an emergency into the Personal partition. I want to duplicate the finished Live USB key, so that both my wife and I will always have a copy available to us on our physical keychains.
I did this on my Mac, and the command to duplicate the USB drive is:
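(The disk identifiers below are placeholders; confirm the source and destination sticks with diskutil list before running anything, since dd will happily overwrite the wrong disk.)

# Find the two USB sticks and note their disk numbers
diskutil list
# Unmount both sticks so dd has exclusive access
sudo diskutil unmountDisk /dev/disk2
sudo diskutil unmountDisk /dev/disk3
# Raw-device copy from the source stick (disk2) to the duplicate (disk3)
sudo dd if=/dev/rdisk2 of=/dev/rdisk3 bs=1m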
If the USB key were ever lost, then whoever picks it up will need to:
Recognize that this is a bootable USB key; otherwise it will just look like a 40GB USB flash drive;
Get the password needed to log in to Linux; I thought about installing two-factor authentication but decided not to, because any good hacker can simply access the partition from another Live key;
If they do mount the partition manually, they still need the LUKS passphrase to decrypt it; I made the LUKS passphrase different from the OS password and twice as long.
I think the risk is worth the benefit of having critical info around in case of an emergency.
Update: WiFi on MacBooks
It looks like MacBooks use Broadcom WiFi chips, and most Linux distributions do not ship with these drivers. This can be easily solved by loading the following software:
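In my case this meant the Broadcom STA (wl) driver package; the exact package can vary by MacBook model, so treat this as the common case rather than a universal answer:

# Proprietary Broadcom STA driver, offered under "Additional Drivers"
sudo apt install bcmwl-kernel-source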
Even with the above software installed, there is still a little ritual:
Launch the “Software and Updates” application;
Select the “Additional Drivers” tab;
Select “do not use this driver” and allow the process to go through and reboot the system;
Re-enter the system and repeat steps 1 & 2, and then select the Broadcom drivers;
Without rebooting, WiFi networks should now be available for use.
Unfortunately the above ritual will have to be performed every time the Live USB stick is powered off.
Update: Tried Linux Live Kit
I wanted to further customize my Live USB key. Instead of keeping a persistent partition, I thought I would keep a Linux VM at home and ensure that it is up to date and customized. At certain intervals, I would then create a Live USB key from the VM install.
I tried Linux Live Kit, but the results were disappointing. I was able to create a bootable USB key that worked, but the OS did not recognize the MacBook's keyboard or trackpad. For some reason, the required drivers did not get bundled during the process. I will have to read up on how to create a Live USB key from scratch rather than depending on these tools, but it is more complicated than I thought, so for now this idea will have to be shelved until I have more time.
[1] For some reason mkusb will not work with live persistence if I simply do a sudo mkusb or run it under a non-root account. The only way that I can get it to work is to run it within a root login session.