Archive for category technology

Which Programming Language Should I Learn?

This question is asked by many aspiring new programmers, and there have been many different answers to it. Some people explain that the language isn't as important as a basic understanding of algorithms and programming paradigms. I agree with this, but it's not the complete picture.

What are you going to be doing with your programming skill? Are you going to use it to obtain a full-time job? Are you going to use it to work on your own projects in your free time? Many people would answer that they have some great app idea that they want to implement. Others can't even tell you why they want to learn to program.

If you are planning to make money in some way from your new skill, you'll probably want to look into which languages are most in demand. For instance, I program in PHP and JavaScript because that seems to be what I can find the most work doing. The slippery slope here is that you eventually specialize in one language even more. If you are working with PHP as a full-time job, you rarely have an opportunity to explore new languages. You're also unable to claim experience in other languages because you've spent years working with one in particular.

So, my advice is to explore other languages as often as possible. If you are starting out, pick a language that has a lot of online discussion. Something like Java or Python would do nicely. These are well supported, very commonly discussed, and are easy enough to learn. Use this first language to learn specific things about programming. You can learn the syntax of the language as you learn algorithms and paradigms. Explore the language enough to understand the concepts of programming. You don't have to become an expert in the language, but you should focus on understanding the abstract ideas used in modern programming.

I started out learning Perl and Bash scripting a long time ago. I soon changed over to VBScript and VB.NET because I was working for a Microsoft shop and needed to do some scripting and app development. I lacked a lot of the basic knowledge that I should have been working on, but I thought that I needed to learn languages.

A language is nothing more than a set of syntactical rules for structuring your thoughts. Think of learning French. You could teach someone all the words needed to speak French. You could teach them how to structure a sentence, but that isn't enough for the person to know how to communicate. Take this English sentence as an example:

You jumped over the computer.

It's a complete sentence. It has a subject, a verb, and is a complete thought. However it wouldn't make sense if you were to say that after being asked:

Are you coming over after work?

You have all the necessary skills to create a full working sentence, but you have no idea how to communicate with another human. That is similar to learning a programming language. You can write "Hello, World", but can you do something as simple as writing a recursive algorithm? Can you find more efficient ways to do things? There's a lot more to programming than the language. The language is only important later on when you need to choose a better tool for a specific task. You may need to write a website with a lot of real-time interactions. NodeJS. You may want to write a game for iOS. Objective-C or Swift. You may need to write an Android app. Java. You may need to write some mission-critical optimized system code. C, C++, or Assembly. You may need to interface with some off-the-wall E-Ticket system and generate some Excel reports on a Windows-only domain. VBScript or .NET.
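As an aside, the recursion test above is worth actually trying in whatever first language you pick. Here's a minimal sketch in JavaScript, one of the languages I work in; the function and the example values are just for illustration:

```javascript
// A recursive function solves a problem by calling itself on a
// smaller version of the same problem, until it reaches a base
// case that can be answered directly.
function factorial(n) {
  if (n <= 1) {
    return 1; // base case: 0! and 1! are both 1
  }
  return n * factorial(n - 1); // recursive case: shrink the problem
}

console.log(factorial(5)); // 120
```

If you can write something like this and explain why it terminates, you've learned a concept that transfers to every language listed above.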

The point is, unless you are in the same job for the rest of your life and the world doesn't change, you'll probably end up needing to know a dozen or so languages at least. An introduction to programming should be taught in pure pseudocode, because I think beginning programmers get too caught up in the language and miss the more important concepts being taught.

At the same time, I've had introduction-to-programming classes that I look back on now and realize that the teachers had no idea what they were trying to teach. They didn't understand the concepts well enough themselves to explain them to someone else.


Back to Linux

I have had an on again, off again love affair with Linux since 1998. It has been an enlightening experience plagued with numerous installations and CD/DVD ISO burns. I remember the first real exposure I had to Linux. I was living with my grandfather. I was working in a factory. My hobbies were computers and playing guitar. There wasn't a lot to my life at that time. I worked, wasted money, and slept. I also chatted on IRC quite a bit. Back then, I was still on dialup so that was about the best thing to do with my internet connection.

I had a friend who some could call a bad influence, but to me, at the time, he felt like a mentor and older advisor. He knew all the ins and outs of IRC, especially on the Undernet IRC network. He went by the nick "Fud", short for "fear, uncertainty, and doubt".  He had Eggdrop IRC bots and Energy Mech IRC bots. He had shell accounts. He had knowledge of things to which I'd never been exposed.

I wanted to create my own IRC bots. So he introduced me to the idea of shell accounts. I remember configuring my first Eggdrop bot. That seemed like the most complicated thing I'd ever attempted. It didn't help that I was configuring the bot in an operating system that I'd never been exposed to at all...Unix. Unix was a mysterious word to me because I'd had no college classes and my high school had barely covered DOS.

I had learned DOS on my own. I didn't have any friends who knew DOS. My dad had bought a government-surplus 8088 computer, and when we turned it on, we were greeted with a command line. We didn't have a clue, so I typed "help" and pressed enter. The output presented to me with that command is how I learned DOS. I felt like such a hacker, mainly because it was a retired government computer and I thought I may be able to find some kind of interesting data on it. It was really a cool way to learn DOS.

Jumping back to '98 when I was first getting into Linux, I was presented with a command prompt that looked nothing like the familiar DOS prompt I had used before. Typing "help" didn't help. Luckily I had Fud there to help get me started with basic commands. Some were similar to DOS. Others were completely different. I was used to typing cd to change directories, but in DOS, to change to the parent directory (move up the directory structure), you can type cd.. all together. In Linux this would give you an error. You had to put a space between the cd and the .., which took some getting used to.

I loved the idea of Linux right from the beginning. I felt excited mainly because I was learning Unix, that mysterious operating system that serious computer geeks knew about and a rural hick like myself had never seen. I was also excited to break the chains from Microsoft. I mean imagine it, an operating system that's completely free. Since I'd always wanted to be a programmer, there was also the added bonus that the source code was also available for most of the programs that came with the OS. Side note: I know some of you are chomping at the bit to tell me that Linux is the kernel, not the OS. Get over it. Everyone calls it Linux.

Ah, my early days with Linux. Like I said earlier, I was still on dial-up, which presented two problems. First, downloading Linux was impossible. An ISO file of Linux was at least the full size of a CD-ROM back then, and for some distributions it was multiple disks for an install. It takes a very long time to download 720 megabytes when your download speed is roughly three or four kilobytes per second. So the only real way to try out many different distributions was to order them from places like CheapBytes. I bought a pack of about 10 different distros and tried them all. It included Debian, Slackware, Red Hat, Mandrake, and others.

The second problem with dial-up and Linux dealt with dial-up modems themselves. Most modems at that time were "WinModems". They weren't "hardware" modems. They were interfaces for phone lines and such, but the actual modem functionality was handled by Windows itself. They were hardware interfaces for software modem code. These wouldn't work in Linux, and honestly they weren't as good as real hardware modems. Most WinModems used a PCI bus, and the hardware modems used the older ISA bus. The hardware modems also usually had hardware DIP switches for configuring interrupt settings and such. Hardware modems were superior, but WinModems were cheaper. So most people were using WinModems. The first thing a new Linux user back then had to do was purchase a hardware modem.

I dual-booted back then, but still stayed primarily in Windows. There was still very little compatibility with lots of hardware and commercial games just weren't available. However, I was able to learn a lot during that time. I started learning Perl. I created my own IRC bot in Perl. My first preferred Linux distribution coming from Windows was Mandrake. At the time, it used KDE and was pretty user-friendly. I learned to hate RPM. Mandrake changed its name to Mandriva, and at some point decided to charge for using it. So I switched to Debian.

I love Debian. It has been my favorite grand-daddy Linux distribution since I moved away from Mandrake. I've tried just about every major distribution that exists. Some of my favorites along the way were Gentoo, Arch, and Sabayon. However, as soon as Ubuntu came out, I, like many other Linux users, switched to it. It quickly became the most popular distribution. Ubuntu took Debian, which was already pretty easy to use, and made it even easier. Around 2004 or 2005, I started using Ubuntu 100% of the time at home. This went on for about three years. In 2007 I took a desktop support role in the IS department of the manufacturing company I had been working at for about 9 years. Most of this desktop support dealt with Windows, so Windows became a primary OS on my home computer once again.

I've been glossing over many things here, but only so I could get to this point. I started working for myself as a web developer and internet marketer in 2009. At that time, I had the personal freedom to use whatever operating system I saw fit. I used both Windows and Linux, but gravitated toward Linux most of the time. As a development environment for web applications, Linux is by far my favorite. My career path was gradually moving in a direction that allowed me to use Linux full time. I interviewed for a position with a travel agency in 2011 as a PHP developer. I went to work there and was pleasantly surprised that everyone was using their favorite OS on their work machines. Some people were using Windows 7. Others were using a flavor of Linux. I picked Ubuntu and installed it on my work machine.

It was around this time that Ubuntu started using Unity as its main desktop environment. I'd been very content using Gnome 2. Unity brought cool features with it, but it also seemed to be a huge buggy mess. I had many problems with it, both at work and at home. I switched to XFCE at work for a bit, but eventually switched to Windows 7. That's where I stayed. After seven months of working for that travel agency, I decided that Orlando, FL just wasn't for me (I had moved to Orlando to take the job), and I moved back to north GA. I was again working for myself, but this time I continued using Windows 7. Oh I tried to use Linux, but I had so many issues with the new desktop environments and the new forks of Xorg that I just gave up on it.

These distributions were all trying to make their desktops work well on mobile devices. As a result, they made things suck on the desktop. Gnome 2 had been perfectly stable and usable. I really liked it. Now Gnome 2 wasn't an option. You could use Gnome 3 in classic mode, which made it look a bit like Gnome 2, but that wasn't really the problem. It wasn't that I didn't like the new look. It was beautiful. The problem was that it just didn't work well. Dual screens worked fine and dandy in Gnome 2. Unity and Gnome 3 choked on them, or just handled things poorly. The push toward mobile device integration by the Linux community nearly killed desktop Linux for me. I hated it. I was so mad at it, not that there was an actual entity to be mad at. I was just mad at what I viewed as pure stupidity.

We had Android. It was already the ultimate Linux mobile platform, but everyone else wanted to get in on the action as well. This could have been accomplished with a separate mobile desktop environment, but nah, we need to force everyone to change. We need to take perfectly stable working desktop environments and throw them out, replacing them with unstable desktop environments meant for mobile devices.

I'd like to point out that of the major desktop operating systems, the only entity that got this right was Apple. They made iOS for their mobile devices and OS X stayed on the desktop. Sometimes change isn't a good thing. Apple got this transition correct. Microsoft screwed the pooch with their Windows 8 introduction as well. What were all these people thinking? Did they think the desktop was already dead and that everyone was already using only mobile devices? I think they lost sight of the fact that people still listen to radio, even though TV was invented. People still watch TV even though desktop computers and the internet were invented. People still use desktop computers even though mobile devices are now in wide-scale use. Why would you screw over the primary users of your operating system just to try to get a foothold in a mobile device market which is already dominated by Apple and Android? Let me backtrack a little there. I'm not saying that they shouldn't try to obtain some market share in the mobile device market. I'm saying that they shouldn't have wrecked their desktop environments to do so.

Between 2012 and September of 2015, I used Windows 7 nearly 100% of the time. Occasionally, I would install a new version of Ubuntu or LinuxMint, hoping that it would be good enough to win me back from my Windows desktop. After all, I was programming and my projects revolved around LAMP stacks. I still used Linux, but it was in the form of virtual machines with no desktop environment. I had completely given up on the Linux desktop.

In early September 2015, I installed LinuxMint on a spare 120GB SSD, and for some reason everything just worked again. The Cinnamon desktop, which I had tried out a few times during my Windows 7 years, seemed to be stable and user-friendly. I left Windows installed on my main 500GB SSD just in case. Three weeks passed and I hadn't booted into Windows.

Today, I reinstalled LinuxMint. This time, I removed Windows and set that 500GB SSD as my /home partition. I'm again Windows free and loving it. Sure, there will be some Steam games that I can no longer play because there are no Linux versions, but I also don't have Windows 10 spying on my every move. I also now have a much better working environment for my development work.

So for anyone else who may have given up on Linux a few years ago, go give it another shot. You may enjoy it.


New Session Cookie Created on Every Page Refresh in CodeIgniter

CodeIgniter's way of handling session data is slick, and I use it a lot. However, on my current project, I went overboard with my configuration changes and accidentally caused a problem that had me scratching my head for a few minutes. I noticed that session data wasn't persisting and that my sessions table (I opted for database storage of my session data) was filling up with new rows of session data every time I reloaded a page in my project. This prevented my login functionality from working.

The solution to my problem was a configuration detail. I had set $config['cookie_domain'] to the domain name I will eventually use for the site. CodeIgniter didn't like this because my development environment is not on that domain. So it was creating new cookie/session data every time I loaded a page. The problem made sense after I thought about it for a bit. I remembered that I had set a few extra settings in the config, and sure enough, that was the winner.

The problem can happen when other settings are incorrect as well. So pay close attention to those settings, and look there first if you notice that sessions are being created on every page load.
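For reference, here's a sketch of the config lines involved in application/config/config.php. This assumes CodeIgniter 2-era setting names, and the domain shown is hypothetical, not the one from my project:

```php
<?php
// application/config/config.php

// Wrong on a dev box: the browser won't send back a cookie scoped to
// a domain you aren't actually browsing on, so CodeIgniter creates a
// brand-new session (and a new DB row) on every request.
// $config['cookie_domain'] = 'example.com';

// Safe default while developing: empty means "current host".
$config['cookie_domain'] = '';

// Database-backed sessions, as I'm using on this project.
$config['sess_use_database'] = TRUE;
$config['sess_table_name']   = 'ci_sessions';
```

Only set cookie_domain to the real domain when the code is actually deployed there.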


Clickbank Analytic Software

There's a site called cb-analytics.com which has always been a great resource for information on Clickbank products. However, I've always found the site hard to navigate, and I wanted a site that showed some of the "hot" Clickbank products. So, I've written a site called cbniches.com which I hope will rectify these issues.

The site shows all the latest products in each category and shows gravity and earnings per sale. It also has a graph for each product to show gravity over time. I think this will be pretty helpful to affiliate marketers looking for new products to promote on Clickbank. Check it out at http://cbniches.com. I wrote it with the latest version of my LavaPHP framework, another product I've been developing as open source. LavaPHP can be found on GitHub, but it's still in early development.


How to play a Bluray movie in Debian Testing

This HOWTO will probably work in Debian Squeeze, but the system I used was running Wheezy. Hopefully this HOWTO won't be necessary for long, but until then, it's a very good way to watch blurays on your Linux system. This will also work on Mac and Windows, with some changes, but this is strictly a Debian Testing HOWTO.

There are various ways to do what we need to do. Some require you to rip the bluray first and then watch the resulting MKV. I found that this took 15 minutes or so on my 8-core system. However, there is another method which uses a program called makemkv and pipes the output to VLC using a network stream. I found this method on the web, but some of the links were broken, so I couldn't use it directly. After some searching, I found the script I was looking for and edited it slightly. For your convenience, I'm rewriting the HOWTO and including the files all here.

First of all, create a folder called makemkv in your home directory. You can use a different location if you prefer but for simplicity we'll put it in our home directory.

mkdir ~/makemkv

Change into that directory and grab two zipped tarballs from the makemkv author. Then extract them.

cd ~/makemkv
wget http://www.makemkv.com/download/makemkv_v1.7.2_bin.tar.gz
wget http://www.makemkv.com/download/makemkv_v1.7.2_oss.tar.gz
tar xvf makemkv_v1.7.2_bin.tar.gz
tar xvf makemkv_v1.7.2_oss.tar.gz

Yes, you need both the bin and the oss packages.

Make sure you have some dependencies.

sudo apt-get install build-essential libc6-dev libssl-dev libgl1-mesa-dev libqt4-dev curl vlc

In other HOWTOs, the curl dependency isn't mentioned. You need curl for the script we will download later.

Next, compile and install the two packages.

cd makemkv_v1.7.2_oss
make -f makefile.linux
sudo make -f makefile.linux install
cd ../makemkv_v1.7.2_bin
make -f makefile.linux
sudo make -f makefile.linux install

Finally download and run playBluRay.sh. I gzipped it so the site wouldn't complain about the file type. Just gunzip it and execute it.

It will run makemkvcon to decrypt the bluray and set up a stream on port 51000 of your computer. Then it will start VLC using the network stream. It may take a few moments to load it all.

Note that sometimes a bluray won't play correctly even with this method, or you may see a behind-the-scenes segment before the movie. I had this problem with my Rambo bluray.


Marte Engine TextEntity setColor()

I've been focusing on learning Java game development over the last couple of weeks. I've found that lwjgl, Slick2D, and Marte Engine are great libraries to help with the basic game functionality. In fact, they take a lot of the work out of it and leave you to focus on the game design itself. For instance, Marte Engine comes with a great resource manager class that helps keep up with images, sounds, and spritesheets for your game. Marte also has some good classes which extend Slick2D's Entity class and can be very useful.

The one I'm going to focus on is TextEntity, which is great for adding text to the screen. However, I was unable to change the text color for the text directly using the setColor() method from the Entity class, which is inherited by TextEntity. There's not a lot of documentation for either Slick2D or Marte, so I wasn't exactly sure if I was missing something or if I had found a problem with the class.

Fixing the problem is rather simple. I created a new class called MyText and copied everything over from the TextEntity class. I could have extended it, but instead I wanted a new class which extends Entity directly. Then I changed the code in the render() method as follows:

public void render(GameContainer container, Graphics g)	throws SlickException {
	if (font == null) {
		font = container.getDefaultFont();
		this.calculateHitBox();
	}
	g.setFont(font);
	if (text != null) {
		g.setColor(this.getColor());
		g.drawString(text, x, y);
	}
}

I then added a new constructor that takes a color as the fifth parameter:

public MyText(float x, float y, Font font, String text, Color color) {
	super(x,y);
	this.setColor(color);
	this.font = font;
	this.setText(text);
}

This gave me the functionality I was after. This change basically just sets the graphics object color property to the Entity's color property, which is exactly how I thought it should have worked to begin with.

If there's a better way to accomplish this, let me know. I'm only intermediate with Java.


Linux Desktop UI Options

I detest the current path of the Linux desktop. There are reasons that the desktop environment programmers and the distribution makers have taken the path they are on, but I disagree with those reasons. Their reasoning involves unifying the user interface for desktops, laptops, tablets, and other mobile devices.

The main problem I see with that reasoning is that they are hurting the desktop environment in the process, and the desktop/laptop is the only place their products have a future. There's no need for a revamping of the desktop interface to make it more like a mobile device's interface. The tablet/phone interfaces are designed for multi-touch interaction from the user. THE ADVANTAGE of a DESKTOP is having a good keyboard and mouse. The desktop UI is designed for superior input methods. Multi-touch is cool and all, but it is designed to give mobile devices an interface to mimic what you can already do on the desktop and laptop with the keyboard and mouse. Multi-touch on a desktop is cool, but it has only been implemented on OS X on Macs, and they didn't change the actual interface itself to accomplish this. They just added multi-touch capability to the UI that was already there.

I could see good reason to make this change to the default Linux user interface if the distributions were going to be used primarily on mobile devices. BUT they aren't. The mobile market has two very strong operating systems. Those are iOS and Android. Android is Linux itself, but its user interface is perfect for a mobile device. It also has the convenience of the Google marketplace. It is the de facto Linux distro for mobile devices. There's no demand for Ubuntu on a mobile device like a tablet or phone. Android does everything you could want to do on those devices, and it does those things well.

So, what is the point of unifying the user interface on these Linux distributions if the only real use you're going to see for them is in the desktop and laptop markets? There's absolutely no reason to do this.

The new unified user interfaces, namely Unity and Gnome 3, are clunky at best in a desktop environment. They are a downgrade from the previously popular user interfaces, especially Gnome 2.

For this reason, I see a shift coming in the primary desktop environment used by most Linux users. I think most users will switch to XFCE4 or one of the other DEs which are similar to Gnome 2. I would be willing to wager that within a year or two, most distributions will be using XFCE4 as a default desktop environment by popular demand from their userbase. Either that will happen, or the desktop environment creators will see the error of their ways and change back to the old desktop standard.

BRING BACK XORG and just improve it!


Edwin Jagger DE89 Review

I've written about my experience with double-edge safety razors in the past, but I felt that I should write another post detailing the advantages of the safety razor over the popular multi-bladed razors of today. You could chalk it up to growing old, but I've realized that a lot of the technology of the past was superior to the modern technology. This is especially true when it comes to razors.

Shaving is one of those fascinating things for the young. I remember wondering what it was like to shave when I was a kid. It seemed like fun. As far back as I can remember, my dad used an electric razor. So, when I grew up, that was what I first tried. Electric razors are great when you first buy them. They don't shave as close as other methods but they are at least simple. I found that after a few uses, they more or less just ripped the hairs out rather than cutting them. They also require lots of cleaning and make a mess.

So I switched to Gillette Sensors. This was one of the first multi-bladed razors, and I liked the results. I would get ingrown hairs and slight irritation, but I was younger then and it didn't bother me that much.

Then I started losing my hair, and I had promised myself at an early age that if I ever started going bald, I would help the process along and just start shaving my head. So I started shaving my head. My hair is very thin on top now, but the sides are still rather thick, which makes shaving difficult in those areas at times.

The problem with shaving your head is that ingrown hairs suck on your head. So that problem became a major issue for me. I started reading around and found that the cause of the ingrown hairs was the type of razor I was using.

Multi-bladed razors were advertised as having the ability to raise the hair up before cutting it. This process causes the hair to be cut below the top of the skin. When the hair grows back, it can sometimes grow back into the skin at an angle, especially in areas where the grain of the hair goes in various directions. I have this problem on the back and front of my neck. If I let these hairs grow out, I would have areas of curly hair.

So I switched to a double-edged safety razor back in September, and I've had great results.

Advantages

Safety razors are good solid tools. The one I bought is chrome and very solid. It is an Edwin Jagger DE89. It is the first and probably the only safety razor I'll ever buy. I bought it, a bar of shaving soap, 100 blades, and a badger hair brush for around $50. That sounds like a high price for a razor, but I've not had to spend another dime on shaving equipment since then and I won't have to buy anything for at least another year. The soap lasts a long time and costs $1.00 a bar. I'm still using the same bar after more than four months. I've only used around 20 blades or so. One hundred blades cost around $9. I still have a lot of blades and they should last me another year or two. Ten bucks for enough blades to last you over a year is awesome, especially if you've ever bought a 4-pack of MachIII or Fusion blades. So cost is a major advantage with the double-edge.

Another advantage is the shaving experience. I take my time. I pay close attention to the shaving process, and it's relaxing. To me, it turned shaving from a chore into a rewarding activity.  I lather up my face, make a single pass with the grain, wash it off, lather again, make another pass against the grain, wash it off, and then do the same thing for my head. I get a super close shave.

I also don't have a problem with ingrown hairs like I used to. I rarely cut myself. There was one occasion when I was shaving my upper lip sideways and ended up cutting my lip a bit, but that was my fault. I don't shave my upper lip very often, and normally keep a Van Dyke. So I wasn't used to shaving there. Other than that, I get very few nicks from shaving with a double edge. The main thing is to keep your face wet, your razor wet, add lots of lather, and don't press on the razor. Also, take your time.

Disadvantages

"Take your time" leads me to the one and only disadvantage of shaving with a double-edge razor. It takes longer. Shaving with a multi-bladed razor is really fast. With a double-edge you need to take your time and concentrate on what you are doing more. Some people won't like this, but I very much enjoy the added time. I usually take around 10 - 15 minutes to shave my face and head. I always finish it off with an aftershave lotion (non-alcohol-based).

Conclusion

Like old keyboards, old razors are just better than their modern equivalents. You don't have to buy an old razor though. There are plenty of manufacturers still making double-edge razors. The Edwin Jagger DE89 was the razor I selected after much shopping online, and I have been very pleased with it. The weight makes it feel like a knife through butter when you first shave with it. You may be used to pressing down with a Mach III or Fusion razor. With the Edwin Jagger it'll feel like you are just letting it cut the hair for you at first. I was very happy with it from the beginning, and it hasn't let me down since. I strongly recommend it. There are many good double-edge razors on the market, but I know you can't go wrong with the Edwin Jagger DE89.


Best Godaddy Alternative Registrar

Many of you may be die-hard GoDaddy users. GoDaddy's support of SOPA has led many people to start looking for a new registrar, myself included. After a boycott, GoDaddy withdrew its support of SOPA. However, their initial support for the legislation was enough to turn me away. I do not plan to do any further business with them and will be slowly migrating my existing sites over to another registrar as time goes by.

In my search for a GoDaddy alternative, I found that I really like NameCheap.com. They offer good rates on domain transfers and registration. I also like their control panel much better than GoDaddy's. They don't have as much up-selling going on when you register a domain. I always found that annoying about GoDaddy. I also didn't care for GoDaddy's domain manager. Once, I tried out GoDaddy's Windows hosting as well. It was terrible. So if you are looking for a viable alternative to GoDaddy, you should definitely give NameCheap.com a try.


Rosewill RK-9000 Review 2

After a day of using the RK-9000, I can say that it is definitely the best keyboard I've ever typed on. The couple of problems I had with it in part 1 of my review have cleared up. There have been no more issues with sticking keys, and I've gotten used to the smaller keyboard size. The smaller size has actually increased my typing dexterity and speed.

I've been using http://typeracer.com to calculate my overall WPM speed. I started out around 58, which is a bit low for me. Since then, I've brought my average typing speed up to around 75wpm, which is about the same as my Unicomp average. However, I've spiked on some races with 89wpm, and it's not uncommon for me to hit over 80wpm in a race. This is faster than my Unicomp speeds. I think the speeds will gradually increase as I become more accustomed to using the RK-9000.

Overall, this may be the best $100 my sister has ever spent on me. The keyboard is an absolute joy.
