<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Preslav Rachev</title>
    <description>The latest articles on Forem by Preslav Rachev (@preslavrachev).</description>
    <link>https://forem.com/preslavrachev</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1440%2F312574a7-fd3f-423c-8a99-0579bfcfaabf.jpg</url>
      <title>Forem: Preslav Rachev</title>
      <link>https://forem.com/preslavrachev</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/preslavrachev"/>
    <language>en</language>
    <item>
      <title>Every End Is a New Beginning</title>
      <dc:creator>Preslav Rachev</dc:creator>
      <pubDate>Tue, 22 Dec 2020 07:57:14 +0000</pubDate>
      <link>https://forem.com/preslavrachev/every-end-is-a-new-beginning-18ek</link>
      <guid>https://forem.com/preslavrachev/every-end-is-a-new-beginning-18ek</guid>
      <description>&lt;p&gt;&lt;em&gt;NOTE: This post originally appeared &lt;a href="https://preslav.me/2020/12/15/every-end-is-a-new-beginning/" rel="noopener noreferrer"&gt;on my blog&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;




&lt;p&gt;These past couple of weeks were overly emotional for me. I made a decision I had been thinking about for the better part of the last five years. I decided to quit my job at &lt;a href="https://www.ki-labs.com/" rel="noopener noreferrer"&gt;KI labs&lt;/a&gt;, and from 2021 onward, I will take a deep dive into &lt;strong&gt;running my own independent software business&lt;/strong&gt;. I want to focus on what business has always been and should be about - &lt;strong&gt;the people&lt;/strong&gt;. Not the money, not the fancy tech, nor the fast-ticking numbers. It should be all about the individuals needing a solution and those working hard to build it.&lt;/p&gt;

&lt;p&gt;I know. To do something like that at the end of such a turbulent year seems reckless. But just look around. If 2020 has not made it clear enough, no money or material possessions on the planet will make you happy if you cannot share your joy or moments of grief with the people around you. If this year has shown us one thing, it is that life is about here and now. Not preparing for what it would be like in 20 years.&lt;/p&gt;

&lt;p&gt;It is precisely at times of such uncertainty that great leaps forward are made. Or great mistakes. Only time will tell.&lt;/p&gt;

&lt;h2&gt;
  
  
  Wait, weren't you already independent?
&lt;/h2&gt;

&lt;p&gt;That was a question I got back from a few people whom I told about my decision recently. The answer is, “Yes and No.”&lt;/p&gt;

&lt;p&gt;Helping individuals and small teams become better at what they do through software has been a dream of mine since I was in high school. Yet, I only started to make strides towards it in recent times. About a year ago, I significantly reduced my working hours to focus on it.&lt;/p&gt;

&lt;p&gt;In the beginning, I was full of excitement. I had many plans and many ideas. However, as time went on, a comfortable routine started taking over. I would spend Mondays (and later, Tuesdays as well) working on my things, and the rest of the week at my day job. Then another Monday would come, and I would spend half a day trying to get back to where I had left things the last time.&lt;/p&gt;

&lt;p&gt;The things that I had planned to work on began dragging along. My original vision got blurry, and distracting ideas swamped my brain. Every new week would bring me more “free time” but also more anxiety. I got angry at myself for not achieving what I had initially set out to do. I even played with the thought of going back to working full-time, but that would mean slamming the door on my dream once again. Perhaps for good? No! Not without having tried, at least.&lt;/p&gt;

&lt;h2&gt;
  
  
  There’s got to be an end to that
&lt;/h2&gt;

&lt;p&gt;A few weeks ago, I came to three realizations which, in hindsight, I should have known all along. To start with, &lt;strong&gt;knowledge work is only effective in bursts of absolute focus and dedication&lt;/strong&gt;. It requires a clear mind and a measurable milestone to mark the end of each burst. Make the milestone too vague, or the burst too long, and the initial excitement starts fading away. Likewise, devote too little time and distract the mind all the time, and you will end up in the same hole.&lt;/p&gt;

&lt;p&gt;Second, &lt;strong&gt;reward requires embracing risk and uncertainty.&lt;/strong&gt; For a long time, I had stuck to having a day job primarily out of the necessity to pay bills and contribute to the household budget. Every change brings disruption, and I am thankful to my wife for bearing with me through the first few months. I am sure that aiming to do good is going to pay off both morally and financially.&lt;/p&gt;

&lt;p&gt;Third, was this quote from author &lt;a href="http://www.johnbingham.com/" rel="noopener noreferrer"&gt;John Bingham&lt;/a&gt;:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;“If you run, you are a runner. It doesn’t matter how fast or how far. It doesn’t matter if today is your first day or if you’ve been running for twenty years. There is no test to pass, no license to earn, no membership card to get. You just run.”&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Put simply, the way to achieve anything you want in life is to play like you are already there. Being a runner starts with a single step, being a writer with a single sentence, and so on. Until about four years ago, I hadn’t run more than a few meters to catch the bus or a closing store. I’ve had a problematic back since I was a teen, which prevented me from competing with most kids my age. Still, close to 30, quite obese, and with terrible pains in my back, I discovered distance running. I knew I was a runner from the first yards that left my feet swollen like pancakes. About a dozen half-marathons later, I am proof that nothing is impossible if you believe in it.&lt;/p&gt;

&lt;p&gt;A large, if not the crucial, portion of running a successful business is playing the role of a successful person from day one. No, I do not mean the lavish lifestyle, but the mindset. To visualize yourself riding a bicycle without the training wheels. You know what they say: &lt;em&gt;perception is reality&lt;/em&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Where to now?
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fimages.unsplash.com%2Fphoto-1495756650324-e45118cb3e35%3Fcrop%3Dentropy%26cs%3Dtinysrgb%26fit%3Dmax%26fm%3Djpg%26ixid%3DMXwxMTc3M3wwfDF8c2VhcmNofDR8fGRpdmV8ZW58MHx8fA%26ixlib%3Drb-1.2.1%26q%3D80%26w%3D2000" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fimages.unsplash.com%2Fphoto-1495756650324-e45118cb3e35%3Fcrop%3Dentropy%26cs%3Dtinysrgb%26fit%3Dmax%26fm%3Djpg%26ixid%3DMXwxMTc3M3wwfDF8c2VhcmNofDR8fGRpdmV8ZW58MHx8fA%26ixlib%3Drb-1.2.1%26q%3D80%26w%3D2000"&gt;&lt;/a&gt;&lt;br&gt;
Photo by &lt;a href="https://unsplash.com/@jeremybishop?utm_source=unsplash&amp;amp;utm_medium=referral&amp;amp;utm_content=creditCopyText" rel="noopener noreferrer"&gt;Jeremy Bishop&lt;/a&gt; on &lt;a href="https://unsplash.com/s/photos/dive?utm_source=unsplash&amp;amp;utm_medium=referral&amp;amp;utm_content=creditCopyText" rel="noopener noreferrer"&gt;Unsplash&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;See this guy? That is where I am at the moment. I have just let my employer know about my plans, and around the same time filed some paperwork for setting up a sole proprietorship. I will be partially involved with my day-to-day work duties &lt;a href="https://preslav.me/my-availability-in-2021/" rel="noopener noreferrer"&gt;until April, after which I will be on my own&lt;/a&gt;. After that, my friends, I am ready to help you bring your projects to the next level. I am always on the hunt for the next hidden gem - someone with a great idea but not enough technical expertise to see it through. Or perhaps someone too caught up in the details. Or maybe even someone who wants to get their story told. I am all ears. Or hey, who knows, perhaps I can tell you about my secret plan to fix &lt;strong&gt;podcast discovery&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;If there is one thing that had tremendous success thanks to the extra free time last year, it was this blog. It gave me a chance to meet and talk to so many of you. I am more than thrilled about what all these new encounters will lead to in 2021 and beyond!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://calendly.com/p5v" rel="noopener noreferrer"&gt;Book an introductory 30-min call&lt;/a&gt;&lt;/p&gt;

</description>
      <category>personal</category>
      <category>business</category>
      <category>career</category>
      <category>startup</category>
    </item>
    <item>
      <title>Has anyone here written and published an IT book? Please, share your experience.</title>
      <dc:creator>Preslav Rachev</dc:creator>
      <pubDate>Wed, 26 Aug 2020 07:44:38 +0000</pubDate>
      <link>https://forem.com/preslavrachev/has-anyone-here-written-and-published-an-it-book-please-share-your-experience-3mad</link>
      <guid>https://forem.com/preslavrachev/has-anyone-here-written-and-published-an-it-book-please-share-your-experience-3mad</guid>
      <description>&lt;p&gt;I have an idea for a book, and I am willing to sacrifice some time off work to write it. For this, I will need to find a way to cover some of the costs of writing the book, while doing it. &lt;/p&gt;

&lt;p&gt;I am interested to know what this experience has been like for others. How did you gauge people's interest in investing in your idea? Is it reasonable to set up a crowdfunding campaign up-front, or is it better to go straight to a publisher?&lt;/p&gt;

&lt;p&gt;Any tips you might offer would be more than welcome.&lt;/p&gt;

</description>
      <category>discuss</category>
      <category>advice</category>
      <category>books</category>
    </item>
    <item>
      <title>My Reply To: The Case Against OOP is Wildly Overstated</title>
      <dc:creator>Preslav Rachev</dc:creator>
      <pubDate>Wed, 05 Aug 2020 06:09:21 +0000</pubDate>
      <link>https://forem.com/preslavrachev/my-reply-to-the-case-against-oop-is-wildly-overstated-4od9</link>
      <guid>https://forem.com/preslavrachev/my-reply-to-the-case-against-oop-is-wildly-overstated-4od9</guid>
      <description>&lt;p&gt;&lt;em&gt;Originally published &lt;a href="https://preslav.me/2020/08/03/my-reply-to-the-case-against-oop-is-wildly-overstated/"&gt;on my blog&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;




&lt;p&gt;What follows is my reply to &lt;a href="https://medium.com/young-coder/the-case-against-oop-is-wildly-overstated-572eae5ab495"&gt;The Case Against OOP is Wildly Overstated&lt;/a&gt;, by Matthew MacDonald. I agree with the author and would like to add my own point of view.&lt;/p&gt;

&lt;p&gt;It took me personally more than a decade of programming to walk up and down the developer experience ladder:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Just make it work and learn some principles along the way.&lt;/li&gt;
&lt;li&gt;Make it work, but follow the principles at all costs.&lt;/li&gt;
&lt;li&gt;Don't care if it doesn't work; blindly follow the principles, regardless.&lt;/li&gt;
&lt;li&gt;Punish others who don't follow the principles.&lt;/li&gt;
&lt;li&gt;Why are principles stepping in my way all the time? Wasn't I supposed to make something that works?&lt;/li&gt;
&lt;li&gt;Learn how to use principles sparingly. Pragmatically focus on making something that works instead.&lt;/li&gt;
&lt;li&gt;Just make it work.&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Problem Origin
&lt;/h2&gt;

&lt;p&gt;Where am I going with this? Like all principles, the idea of Object-Oriented Programming (OOP) originated as a way to guide programmers in solving &lt;em&gt;a particular problem&lt;/em&gt;, not as the only way of solving &lt;em&gt;all problems&lt;/em&gt;. The fact that developers occasionally get burned by using it has less to do with OOP than with the human brain's stubborn pursuit of making problems simpler than they are. Of making them fit into a one-size-fits-all shoe box.&lt;/p&gt;

&lt;p&gt;At the time of OOP's origination, codebases had started growing in size and complexity. The concept that we nowadays refer to as technology XYZ's &lt;em&gt;Standard Library&lt;/em&gt; or &lt;em&gt;SDK&lt;/em&gt; didn't exist yet. For those to be developed without hundreds of duplications, the need arose for ways to encapsulate common logic and data. That is how classes were born (encapsulation). Classes allowed multiple independent instances (objects) to interact with each other, sharing common functionality through their ancestors (inheritance). To further ease the reuse of code, languages provided ways to work safely with objects without explicitly knowing about the implementation behind their behaviors (polymorphism).&lt;/p&gt;

&lt;p&gt;OOP quickly took the programming world by storm and allowed many of the foundational technologies we rely upon today to get built. It was the pivotal point that led some to believe that it could become the solution for &lt;em&gt;all&lt;/em&gt; programming problems.&lt;/p&gt;

&lt;h2&gt;
  
  
  Applications != Libraries
&lt;/h2&gt;

&lt;p&gt;As the author of the article points out, there is nothing wrong with OOP when used appropriately, for building the right things. See, one thing that hardly any programming course or practice will teach you is that &lt;em&gt;developing libraries requires a different approach to programming than developing end-user applications&lt;/em&gt;. Libraries need the deep hierarchies and Byzantine levels of encapsulation to ensure that the core functionality doesn't change, but gets reused as many times as possible. Libraries have a possibly indefinite number of consumers, and once established, can only be extended by adding functionality that didn't exist before, or by modifying the logic hidden behind the layers of abstraction. Breaking the abstraction requires changing all possible consumers of a library.&lt;/p&gt;

&lt;p&gt;Applications, on the other hand, change all the time. This is their most normal behavior. The problems applications try to solve change all the time; on a fundamental level, nothing in the Universe ever stays the same. Why is it then that programming practices teach us to treat applications as if they were libraries? This is a fundamental sin - trying to apply a principle of reusability and component isolation to a problem that will have morphed by the time a solution for it gets drafted. I am not saying that laying out abstractions and sticking to principles is bad. It does help when identifying certain parts of an application that have withstood the test of time. Until then, enforcing principles over a dynamically changing problem is like trying to catch sunlight in a mirror without moving the mirror an inch, waiting for the sunlight to fall on it.&lt;/p&gt;

&lt;p&gt;As MacDonald concludes, there is a recent proliferation of multi-paradigm programming languages. Ones that feature a subset of OOP without enforcing its use, but allow the programmer to resort to other practices, e.g. Functional Programming (FP), when they better fit the problem at hand. Knowing when to stick to a certain principle, and when to pragmatically discard it where it does not apply, is what distinguishes experienced developers from the rest.&lt;/p&gt;

</description>
      <category>programming</category>
      <category>oop</category>
      <category>java</category>
    </item>
    <item>
      <title>In Search of a New Laptop</title>
      <dc:creator>Preslav Rachev</dc:creator>
      <pubDate>Sun, 02 Aug 2020 10:29:03 +0000</pubDate>
      <link>https://forem.com/preslavrachev/in-search-of-a-new-laptop-2nfl</link>
      <guid>https://forem.com/preslavrachev/in-search-of-a-new-laptop-2nfl</guid>
      <description>&lt;p&gt;&lt;em&gt;Originally published &lt;a href="https://preslav.me/2020/08/02/in-search-of-a-new-laptop/"&gt;on my blog&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;




&lt;p&gt;Am I in dire need of a new laptop? No, not really. My 2015 15" MacBook Pro is still perfectly fine and beats many of its 2020 PC contenders. But just like &lt;a href="https://m.signalvnoise.com/back-to-windows-after-twenty-years/"&gt;DHH a few months ago&lt;/a&gt;, I also began searching for alternatives, after getting at odds with some of Apple's decisions.&lt;/p&gt;

&lt;p&gt;Despite all the issues with the Mac product line in recent years (the Dark Age between 2016 and 2020), Apple's computers are still by far the best piece of hardware you can find on the market. Notice that I am not emphasizing speed, performance, resolution, or form factor alone. You can find other machines that excel in one of these dimensions, and suck big time in all the rest. What I am talking about is the entire package. And when it comes to the overall package, Apple blows everyone away.&lt;/p&gt;

&lt;p&gt;If the hardware isn't the problem, then what? Well, the part that's left, namely, the software. More precisely, Apple's &lt;a href="https://developer.apple.com/news/?id=12232019a"&gt;intention of becoming the ultimate gatekeeper&lt;/a&gt; of its software ecosystem. This has been the case on iOS since its inception, so why bother? macOS has only been kept free from the grip of the Apple machinery for as long as it took to bake in the safety measures, but one can see the inevitable happening with every new release. The more macOS and iOS (iPad OS) converge on features and codebase, the easier it will become to pull the plug on all 3rd-party apps unless they get properly sandboxed and distributed through the App Store.&lt;/p&gt;

&lt;p&gt;Is this that bad? I mean, gate-keeping apps will lead to improved safety, right? True, unless it starts interfering with my workflow. I tend to consider myself a pro user. Although I switched to doing much of my software development on a remote server quite some time ago, I still need the necessary tooling to do my work well. Plus, as a pro user, I have built my workflow around a set of routines and apps that allow me to do things faster and more efficiently than the average macOS user. I am not saying that all this would suddenly disappear if macOS and iOS converged. I fear that it would put an additional burden on the developers of those tools, making the stakes too high for some to keep playing the game. Ever since the split between iOS and iPad OS, there has been a surge of more advanced productivity and development apps for the iPad, demonstrating a case where it might be possible to do software development solely on the iPad. I've tried it and it works, but we're not there yet. Just not there yet.&lt;/p&gt;

&lt;h1&gt;
  
  
  What are the alternatives?
&lt;/h1&gt;

&lt;p&gt;As I have mentioned, much of my development work happens on a remote machine these days. I don't need top-of-the-line performance, as much as I care about a premium build, minimalist form-factor, great display/keyboard, and a guarantee that five years from now, I could still walk around with this laptop and use it as my daily driver.&lt;/p&gt;

&lt;p&gt;Like DHH, I have been checking out the alternatives. Using Linux on the desktop full-time is not a thing I am after (for now). No hard feelings - working on Linux is how I earn my living - but Linux on the desktop can get fiddly at times, especially when fiddling with your OS is the last thing you have time for. What's left then is either trying Windows again (with the WSL), or keeping my mouth shut and sticking with macOS. After ten years within the Apple ecosystem, thinking about going back to Windows sounds both scary and exciting. For one, I have been pleased to see how much more developer-friendly the "new" Microsoft has become. I know a thing or two about the MS developer experience from my former life as a .NET developer, and it is leaps and bounds better than what it used to be. On the other hand, going back to Windows with all of its fragmentation, and its myriad of options of varying quality, feels off-putting at first. To quote DHH:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;[But] for me, this just wasn’t worth it. I kept looking for things I liked about Windows, and I kept realizing that I just fell back on rationalizations like “I guess this isn’t SO bad?”. The only thing I really liked was the hardware, and really, the key (ha!) thing there was that the keyboard just worked. It’s a good keyboard, but I don’t know if I’d go as far as “great”. (I still prefer travel, control, and feel of the freestanding Apple Magic Keyboard 2).&lt;/p&gt;

&lt;p&gt;[...]&lt;/p&gt;

&lt;p&gt;Windows still clearly isn’t for me. And I wouldn’t recommend it to any of our developers at Basecamp. But I kinda do wish that more people actually do make the switch. Apple needs the competition. We need to feel like there are real alternatives that not only are technically possible, but a joy to use. We need Microsoft to keep improving, and having more frustrated Apple users cross over, point out the flaws, and iron out the kinks, well, that’s only going to help.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Let me finish this monologue with something of a conclusion. Although I am not particularly looking for a new machine right now, I am genuinely interested in seeing what other options there are on the market. Once Apple brings out the new ARM-based laptops, it would be hard not to think of investing in one of them, but hey, if anyone could convince me of a great alternative, I would love to hear about it.&lt;/p&gt;

</description>
      <category>apple</category>
      <category>laptop</category>
      <category>windows</category>
      <category>linux</category>
    </item>
    <item>
      <title>Getting Your Bear Notes Productivity to the Next Level Using Alfred</title>
      <dc:creator>Preslav Rachev</dc:creator>
      <pubDate>Tue, 30 Jun 2020 15:36:54 +0000</pubDate>
      <link>https://forem.com/preslavrachev/getting-your-bear-notes-productivity-to-the-next-level-using-alfred-42m</link>
      <guid>https://forem.com/preslavrachev/getting-your-bear-notes-productivity-to-the-next-level-using-alfred-42m</guid>
      <description>&lt;p&gt;&lt;em&gt;Originally published &lt;a href="https://preslav.me/2020/06/29/getting-your-bear-notes-productivity-to-the-next-level-using-alfred/"&gt;on my blog&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;




&lt;p&gt;A tool is great if it:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;does one thing, but does it well&lt;/li&gt;
&lt;li&gt;mixes well with the other tools in one’s toolbox&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://bear.app/"&gt;Bear&lt;/a&gt; is one of the greatest note-taking applications, available for the Apple ecosystem. A big part of what makes it so great is its clean and simplistic user experience. I have tried switching to other tools such as &lt;a href="https://www.notion.so/"&gt;Notion&lt;/a&gt;, and lately, &lt;a href="https://roamresearch.com/"&gt;Roam Research&lt;/a&gt;, but I always come back to Bear. The simplicity of writing in plain Markdown, combined with the intuitive organisation of notes thanks to nested tags, make for a powerful combination. One that allows anyone to start with a simple note, and expand it to a complete digital map of their brain.&lt;/p&gt;

&lt;p&gt;Simplicity means that a tool won’t always be able to answer 100% of one’s specific needs. But that’s totally OK, as long as achieving what one needs is possible through other means. Since I am using Bear as my go-to knowledge database, being able to get quick answers is crucial. While Bear does have an integrated search functionality, switching back and forth between applications can be a serious context switch.&lt;/p&gt;

&lt;p&gt;Thus, I was super happy to find out about &lt;a href="https://github.com/drgrib/alfred-bear"&gt;this Alfred Workflow&lt;/a&gt;, which allows for searching one’s entire Bear database directly in &lt;a href="https://www.alfredapp.com/"&gt;Alfred&lt;/a&gt;:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ZlYCnGBm--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://github.com/drgrib/alfred-bear/raw/master/doc/BasicSearch.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ZlYCnGBm--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://github.com/drgrib/alfred-bear/raw/master/doc/BasicSearch.png" alt=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;One can search for anything, including filtering by hashtags, or any of the special &lt;code&gt;@&lt;/code&gt;-style tags in Bear (&lt;code&gt;@today&lt;/code&gt;, &lt;code&gt;@todo&lt;/code&gt;, etc). Additionally, one can explicitly start a new note by typing &lt;code&gt;bn&lt;/code&gt;, followed by the note’s title.&lt;br&gt;
More than anything, the workflow is completely open-source, written in Go by a helpful developer (&lt;a href="https://github.com/drgrib"&gt;Chris Redford&lt;/a&gt;) who is always open to new contributions, ideas, and feature suggestions. I hope to have been helpful (and not too annoying) to Chris in the development of this workflow, if not on the coding side, then at least with suggestions and feedback.&lt;/p&gt;

&lt;p&gt;If Alfred and Bear are two apps you use on a daily basis, you absolutely must &lt;a href="https://github.com/drgrib/alfred-bear/releases"&gt;download&lt;/a&gt; this workflow. It will make your productivity skyrocket.&lt;/p&gt;

</description>
      <category>productivity</category>
      <category>bear</category>
      <category>alfred</category>
      <category>tools</category>
    </item>
    <item>
      <title>Reflecting on My Experience With Go, One Year After</title>
      <dc:creator>Preslav Rachev</dc:creator>
      <pubDate>Sun, 09 Feb 2020 16:23:39 +0000</pubDate>
      <link>https://forem.com/preslavrachev/reflecting-on-my-experience-with-go-one-year-after-nl8</link>
      <guid>https://forem.com/preslavrachev/reflecting-on-my-experience-with-go-one-year-after-nl8</guid>
      <description>&lt;p&gt;&lt;strong&gt;NOTE:&lt;/strong&gt; &lt;em&gt;This post originally appeared on &lt;a href="https://preslav.me/2020/01/17/reflecting-on-my-experience-with-go-one-year-after/"&gt;my blog&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;




&lt;p&gt;In my little more than a year of day-to-day developer experience with Go, I have so far &lt;strong&gt;learned three things&lt;/strong&gt;:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;I can change my entire view of how programming works, even after 12+ years of doing it in one form or another.&lt;/li&gt;
&lt;li&gt;People are blaming Java for all the wrong reasons.&lt;/li&gt;
&lt;li&gt;People are praising Go for all the wrong reasons.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Let me explain. This post is not about saying that “language A is better than language B”, or vice versa. It is about asking ourselves why things work the way they do, and whether doing them differently is a bad or, potentially, a good thing.&lt;/p&gt;

&lt;p&gt;Before joining the Go camp, I had worked quite a few years as a Java developer, with all the stereotypes that this role could evoke in one’s head. I co-developed data-processing systems for various industries. Yet, much of the code I wrote was just boilerplate: passing data from one format to the other, or devising complex abstractions behind what should have really just been calling a function and obtaining its result. Yes, the code was difficult to comprehend, but I was proud of it for this exact reason. The more hoops I created, the more secure I felt that:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;I was doing what I thought was right&lt;/li&gt;
&lt;li&gt;If people didn’t understand the code, they’d have to come to me for advice, further boosting my ego.&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  The language is not to blame for this
&lt;/h2&gt;

&lt;p&gt;The fact that much of the existing Java code is full of bureaucracy has nothing to do with the language itself, or with its platform. Our developer community should bear the sole responsibility. I can assure anyone that perfectly functioning Java applications can be written without 90% of the ceremony. They will be smaller and run faster. Most probably, easier to comprehend, too. And yet, they won’t get you hired in any well-respected company. They just won’t pass the &lt;em&gt;developer prejudice&lt;/em&gt; test. I know. I’ve seen many elegant solutions and rejected them for not being &lt;em&gt;idiomatic&lt;/em&gt; enough.&lt;/p&gt;

&lt;h2&gt;
  
  
  Go isn’t a silver bullet either
&lt;/h2&gt;

&lt;p&gt;By much the same reasoning, jumping ship towards Go just because “it is not Java” won’t bring anyone far. Even before I started writing Go, I had heard and read many stories about how simple and fast it made everything, how little ceremony it had compared to Java, how it would eventually kill all other languages, etc. All blah, blah. Despite all of the above being true, you have to discover the truth in each claim for yourself. If you approach the language out of desperation with your current way of working, you’re going to be set for a rough path.&lt;/p&gt;

&lt;p&gt;See, if all you wanted was a faster-running (name language of choice), you could certainly get it. Yet, holding on to the mental baggage of your previous experience will be hard and messy. My first Go project started out as a rewrite of a Spring Boot app I’d started earlier, so I thought I’d just organize it the same way. To keep the story short, let’s just say it was a spectacular disaster. Only after I started from scratch did it really start taking off.&lt;/p&gt;

&lt;h2&gt;
  
  
  Go is a language without (or with fewer) idioms
&lt;/h2&gt;

&lt;p&gt;Let’s do a naive math experiment. Imagine that you could create valid programming expressions combining any 3 keywords, from a programming language’s vocabulary. Thus, if a language only has 10 keywords, the maximum number of possible expressions is 10 * 9 * 8 = 720. In contrast, a language, with, say, 20 keywords would end up having 20 * 19 * 18 = 6840 expressions. Twice as many keywords would result in almost 10 times as many expressions!&lt;/p&gt;
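The counts above are just the permutation count n × (n − 1) × (n − 2). A minimal sketch of the arithmetic in Go (the &lt;code&gt;expressions&lt;/code&gt; helper is a name of my own choosing, purely for illustration):

```go
package main

import "fmt"

// expressions counts ordered combinations of 3 distinct keywords
// drawn from a vocabulary of n keywords: n * (n-1) * (n-2).
func expressions(n int) int {
	return n * (n - 1) * (n - 2)
}

func main() {
	fmt.Println(expressions(10)) // 720
	fmt.Println(expressions(20)) // 6840
	// Doubling the vocabulary multiplies the expression count by 9.5x.
	fmt.Printf("%.1fx\n", float64(expressions(20))/float64(expressions(10)))
}
```

To be precise, 6840 / 720 = 9.5, which is where "almost 10 times as many expressions" comes from.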

&lt;p&gt;Languages tend to encourage the creation and use of idioms. With that many possible expressions, it’s normal behaviour for an individual or a group of people to start associating and using certain expressions for certain things. The problems usually occur when another group comes up with its own way of expressing the same thing. Both are perfectly valid, but each group will have issues understanding the other.&lt;/p&gt;

&lt;p&gt;This is not to say that Go, having a very strict and concise nature, is totally devoid of idioms. That would be impossible. It is in our nature to try to associate and abstract certain concepts. Yet, when a language has a deliberately smaller vocabulary, the chances of different groups accidentally finding multiple ways of doing the same thing are smaller. This helps the communication between people a great deal, but comes with a very obvious downside. Code (or any written expression, for that matter) without idioms is very, very verbose.&lt;/p&gt;

&lt;p&gt;So, whoever told you that Go is not a verbose language either lied to you on purpose, or hadn’t really seen any other programming languages up until that point. But hey, we agreed that verbosity in the name of communication and common understanding is actually a good thing, right?&lt;/p&gt;

&lt;h2&gt;
  
  
  Go is a test for senior engineers
&lt;/h2&gt;

&lt;p&gt;A lot has been said about the initial concept behind Go, and how the idea was to design a language for juniors fresh out of college, with little programming experience. I think that understanding the beauty of going back to the roots of programming can be a cathartic experience for many seasoned programmers.&lt;/p&gt;

&lt;p&gt;See, junior programmers start with little baggage and few preconceptions, so in their view, anything that can be done with code is fair and justified. Including burning a CPU, or erasing a disk due to an arithmetic error.&lt;/p&gt;

&lt;p&gt;Somewhere along the middle of the career path, a bunch of principles start to pile up. All of them come out of the desire to build on what’s already been learned, and to make sure that things run smoothly and safely without immediate supervision. Learning and applying the principles is great, because it ensures a gradual path forward. But for many, they become a dogma to stick to blindly, without asking whether a simpler alternative could be better.&lt;/p&gt;

&lt;p&gt;The problem with principles is that they only work well around 80% of the time. It is the remaining 20% that can be disastrous for a project, or for one’s career. It is understanding where to apply a principle, and where to deliberately throw it away in the name of pragmatism, that turns a software engineer into a senior software engineer.&lt;/p&gt;

&lt;p&gt;To really appreciate Go, one needs to learn to discern what makes it and its community stand out from the rest. One needs to go through a phase of utter disgust with the language for “lacking” certain features. Moving on despite the urge to go back to familiar ground will result in one of two things:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Realising that the Go language is indeed not what one needs or wants&lt;/li&gt;
&lt;li&gt;Learning to appreciate going back to the roots, as well as when to favour pragmatism over principles&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;In any case, it would be an interesting experience.&lt;/p&gt;

</description>
      <category>go</category>
      <category>java</category>
      <category>2cents</category>
    </item>
    <item>
      <title>A Crystal in Go’s World</title>
      <dc:creator>Preslav Rachev</dc:creator>
      <pubDate>Sat, 01 Feb 2020 18:48:13 +0000</pubDate>
      <link>https://forem.com/preslavrachev/a-crystal-in-go-s-world-2g5e</link>
      <guid>https://forem.com/preslavrachev/a-crystal-in-go-s-world-2g5e</guid>
      <description>&lt;p&gt;&lt;em&gt;This post was originally published on my &lt;a href="https://p5v.me/2020/01/a-crystal-in-go-s-world/"&gt;blog&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;




&lt;p&gt;Imagine a programming language with the ergonomic syntax of Ruby, the speed of execution of C, the concurrency model of Go, and last but not least, a compiler that performs null checks at compile time. Sounds like a dream? Well, this language exists, but chances are, you haven’t heard of it so far. &lt;/p&gt;

&lt;p&gt;Meet &lt;a href="https://crystal-lang.org/"&gt;Crystal&lt;/a&gt;!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--bVcg6wMU--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.shortpixel.ai/client/to_webp%2Cq_glossy%2Cret_img%2Cw_882/https://p5v.me/wp-content/uploads/2020/01/image.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--bVcg6wMU--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn.shortpixel.ai/client/to_webp%2Cq_glossy%2Cret_img%2Cw_882/https://p5v.me/wp-content/uploads/2020/01/image.png" alt=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Crystal is all of the above, plus it has types, &lt;a href="https://crystal-lang.org/reference/overview/"&gt;outstanding documentation&lt;/a&gt;, and a strong community, delivering a steady stream of new libraries (a.k.a. “shards”). Don’t be fooled by the current version number (0.32.1). Crystal has been around for quite a few years (since 2012) and has a mature set of language features and an ecosystem of good libraries.&lt;/p&gt;
&lt;h2&gt;
  
  
  Where does the speed come from?
&lt;/h2&gt;

&lt;p&gt;Crystal produces fast and lightweight native applications using the LLVM infrastructure. When I say fast, I mean, &lt;a href="https://github.com/kostya/benchmarks"&gt;really fast&lt;/a&gt;. Take the fastest Go code you can find and chances are, the same code in Crystal will perform at least on par with it, and often quite a bit faster. Measuring Crystal’s performance against that of Ruby makes no sense.&lt;/p&gt;

&lt;p&gt;There are no runtime frameworks or virtual machines necessary. One can just grab the compiled binary and deploy it. When compared with deploying and running a Ruby application, this feels like a whole different league.&lt;/p&gt;

&lt;p&gt;Note that there are some caveats, which I am going to discuss in a future blog post. For now, let’s just say that building and distribution are about as easy as in Rust. As of yet, nothing can beat the Go compiler speed-wise, but my experience with the Crystal tooling has been more than pleasant so far.&lt;/p&gt;
&lt;h2&gt;
  
  
  CSP-style concurrency
&lt;/h2&gt;

&lt;p&gt;One of the things that make Go so interesting is its concurrency model. The idea about goroutines that communicate via channels is based on an approach dating back to the late 1970s, called &lt;a href="https://en.wikipedia.org/wiki/Communicating_sequential_processes"&gt;Communicating Sequential Processes (CSP)&lt;/a&gt;. Crystal uses an analogous approach. Programs run in what is known as &lt;a href="https://crystal-lang.org/reference/guides/concurrency.html"&gt;“fibers”&lt;/a&gt;. The main fiber can spawn any number of concurrent fibers that send and receive data via blocking channels.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight crystal"&gt;&lt;code&gt;&lt;span class="n"&gt;channel&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;Channel&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="no"&gt;Nil&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;

&lt;span class="n"&gt;spawn&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt;   
    &lt;span class="nb"&gt;puts&lt;/span&gt; &lt;span class="s2"&gt;"Before send"&lt;/span&gt;   
    &lt;span class="n"&gt;channel&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;send&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kp"&gt;nil&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;   
    &lt;span class="nb"&gt;puts&lt;/span&gt; &lt;span class="s2"&gt;"After send"&lt;/span&gt; 
&lt;span class="k"&gt;end&lt;/span&gt; 

&lt;span class="nb"&gt;puts&lt;/span&gt; &lt;span class="s2"&gt;"Before receive"&lt;/span&gt; 
&lt;span class="n"&gt;channel&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;receive&lt;/span&gt; 
&lt;span class="nb"&gt;puts&lt;/span&gt; &lt;span class="s2"&gt;"After receive"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
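&lt;p&gt;For comparison, here is my own rough translation of the same program into Go (not from the original post), with a goroutine and a channel in place of the fiber and channel:&lt;/p&gt;

```go
package main

import "fmt"

func main() {
	channel := make(chan struct{})

	go func() {
		fmt.Println("Before send")
		channel <- struct{}{} // blocks until the main goroutine receives
		fmt.Println("After send")
	}()

	fmt.Println("Before receive")
	<-channel
	fmt.Println("After receive")
}
```

&lt;p&gt;Note that, just as in the Crystal version, the program may exit before “After send” gets printed: once the main goroutine (or fiber) finishes, the whole process ends.&lt;/p&gt;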



&lt;h2&gt;
  
  
  Why re-invent Ruby in 2020?
&lt;/h2&gt;

&lt;p&gt;The creators of Crystal obviously didn’t intend to change the world of programming by creating a new language. They just &lt;a href="https://web.archive.org/web/20181126095302/https://manas.tech/blog/2016/04/01/the-story-behind-crystal.html"&gt;loved Ruby&lt;/a&gt; and were sad to have to leave it for a more performant and type-safe alternative. Due to a series of trade-offs at the implementation level, Ruby is still slower and more memory-hungry than its competitors. Despite perfectly serving the needs of a large segment of Web users through Rails, its performance puts it at the back of the pack when it comes to other use cases.&lt;/p&gt;

&lt;p&gt;The point is fair and valid. As a language, Ruby has a concise and elegant syntax. Once beyond the basic idioms, writing Ruby evokes pure joy. Crystal brings that joy to an even higher level through type-safety, native speed, and an extremely simple concurrency model.&lt;/p&gt;

&lt;p&gt;Don’t get me wrong, I like Go too, precisely because of its verbosity and lack of idioms. When working with others on a big project, I’d prefer more ceremony and hoops, in the name of transparency and equal code comprehension. Different languages exist to serve different purposes and be used by different groups of people. The trick is knowing when to use one and when the other.&lt;/p&gt;

&lt;h2&gt;
  
  
  So, is Crystal worth a look?
&lt;/h2&gt;

&lt;p&gt;Absolutely! Even if only to know that it exists and to keep an eye on it, I’d check it out and write a few applications with it. Whether Crystal will take off in the future is more difficult to say, however. As mentioned, the 99% resemblance to Ruby is nice, and so is the blazing-fast performance. Yet, I am missing a drive within the Crystal community towards more prominence. There has been a long-awaited move towards a 1.0 release, which is a crucial milestone and would surely bring in many newcomers. To my understanding, the language and its tooling are stable enough for a 1.0 release.&lt;/p&gt;

&lt;p&gt;I understand that Crystal does not have the backing of either Google or Mozilla. Neither does it have multi-billion-dollar use-cases to put on its home page. I understand that fighting for the same space with Go, C/C++, and Rust is an unfair battle. Yet, I also believe that we’re long past the days when choosing one technology over another was a zero-sum game. All it needs is a little push.&lt;/p&gt;

&lt;p&gt;I am hoping for the best!&lt;/p&gt;




&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/m8bYqfrGjGo"&gt;
&lt;/iframe&gt;
&lt;br&gt;
&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/KU1tAjP9Xk0"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

</description>
      <category>crystal</category>
      <category>ruby</category>
      <category>go</category>
      <category>language</category>
    </item>
    <item>
      <title>What (if any) is your preferred auto-formatter for Python?</title>
      <dc:creator>Preslav Rachev</dc:creator>
      <pubDate>Sat, 26 Oct 2019 16:46:06 +0000</pubDate>
      <link>https://forem.com/preslavrachev/what-if-any-is-your-preferred-auto-formatter-for-python-3b5f</link>
      <guid>https://forem.com/preslavrachev/what-if-any-is-your-preferred-auto-formatter-for-python-3b5f</guid>
      <description>&lt;p&gt;This is more of a discussion poll than an actual post. I am writing a blog post about auto-formatting options for Python, and wanted to source a little bit of intel from the community itself.&lt;/p&gt;

&lt;p&gt;I'd really love to know which one is your favorite and why. If you don't want to bother writing at length, I have created a &lt;a href="https://twitter.com/preslavrachev/status/1188112156044214274"&gt;Twitter poll&lt;/a&gt;, where you can simply choose your fave.&lt;/p&gt;

&lt;p&gt;Thanks!&lt;/p&gt;

</description>
      <category>python</category>
      <category>programming</category>
      <category>discuss</category>
    </item>
    <item>
      <title>Good Code is Boring</title>
      <dc:creator>Preslav Rachev</dc:creator>
      <pubDate>Sun, 22 Sep 2019 15:27:57 +0000</pubDate>
      <link>https://forem.com/preslavrachev/good-code-is-boring-1ikc</link>
      <guid>https://forem.com/preslavrachev/good-code-is-boring-1ikc</guid>
      <description>&lt;p&gt;You have seen them many times. Small snippets of code and the question &lt;strong&gt;&lt;em&gt;“What would the the following piece of code print&lt;/em&gt;&lt;/strong&gt;” beneath. You might have tried taking a guess, and perhaps, even failed. &lt;/p&gt;


&lt;p&gt;I find those kinds of questions utterly pointless. They not only teach you &lt;strong&gt;nothing&lt;/strong&gt; about real-world programming, but might also lead many newcomers to long-term frustration with programming. &lt;/p&gt;

&lt;p&gt;All programming languages have baggage - obscure features that made it into the spec but were later deemed &lt;strong&gt;hacks&lt;/strong&gt; that should be avoided. There is a certain sense of pride that junior programmers feel when they find such hacks and use them to demonstrate problem-solving skills. I am all in favor of encouraging developers to be proactive and think outside the box. Yet, I often try to point out that using questions like the above as a way of judging one's skills is the wrong way to do it. The mere fact that an opportunity for such questions exists in the first place should make one take a skeptical look at the language itself. &lt;/p&gt;

&lt;p&gt;Contrary to what your teacher taught you, real-world programming is anything but proving yourself at solving complex riddles. Much of the well-written production code I have seen is pretty trivial and boring-looking. This makes it easy to follow and maintain years down the road, once its original creator no longer works on it.&lt;/p&gt;

</description>
      <category>beginners</category>
      <category>advice</category>
      <category>2cents</category>
    </item>
    <item>
      <title>What aspect of Go were you at odds with, coming from a different tech stack?</title>
      <dc:creator>Preslav Rachev</dc:creator>
      <pubDate>Fri, 29 Mar 2019 20:06:33 +0000</pubDate>
      <link>https://forem.com/preslavrachev/what-aspect-of-go-were-you-at-odds-with-coming-from-a-different-tech-stack-2id5</link>
      <guid>https://forem.com/preslavrachev/what-aspect-of-go-were-you-at-odds-with-coming-from-a-different-tech-stack-2id5</guid>
      <description>&lt;p&gt;Recently, I added the Go language to my programming tool belt. Yet, as much as I want to give the language more and more chances, it hasn't really been a shiny success story so far. A lot of things in how one writes Go programs are just different. And I mean, not only syntax-wise. I have spent quite a lot of time digging into the philosophy of the language, and guiding principles of the founding team, and I have to say, for the most part, I do agree with them on 100%. Go isn't exactly a beautiful and fluent language in terms of its level of syntax sugar and this is supposed to be a feature and not a bug. Having witnessed the evolution of Java over the past close to a decade, I have quite learned to like the verbosity and ceremony. As much as it hurts to write code, it makes its long-term comprehension and maintenance much easier. &lt;/p&gt;

&lt;p&gt;And yet, I can't make myself appreciate Go as much as I want to. There are some things that go against my principles, and I can't decide whether to accept them as an ingenious way of doing things, or cross them out as utterly stupid. Like when you pass an argument to a function and expect the return value to be a modified copy, but instead, it is the argument itself that gets modified. I know these are things I could simply avoid doing, but it is difficult not to stumble upon them everywhere, because they are part of what is known as "idiomatic Go".&lt;/p&gt;
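&lt;p&gt;A minimal illustration of the kind of surprise I mean (my own contrived example, with made-up names): a Go slice header shares its backing array with the caller, so a function that appears to return a modified copy can silently modify the caller’s data too.&lt;/p&gt;

```go
package main

import "fmt"

// markFirst looks like it returns a modified copy, but the slice it
// receives shares its backing array with the caller, so the write is
// visible on the caller's side as well.
func markFirst(xs []string) []string {
	if len(xs) > 0 {
		xs[0] = "changed"
	}
	return xs
}

func main() {
	original := []string{"a", "b"}
	modified := markFirst(original)
	fmt.Println(original[0]) // "changed" — the argument itself got modified
	fmt.Println(modified[0]) // "changed"
}
```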

&lt;p&gt;Maybe it's just me, or maybe not. I don't mean to start an argument, but just to figure out what struck others as odd when approaching the Go language from a different domain.&lt;/p&gt;

</description>
      <category>discuss</category>
      <category>go</category>
    </item>
    <item>
      <title>Don't Build Your Products Around Technologies. Build Technologies Around Your Products</title>
      <dc:creator>Preslav Rachev</dc:creator>
      <pubDate>Fri, 18 Jan 2019 05:25:53 +0000</pubDate>
      <link>https://forem.com/preslavrachev/dont-build-your-products-around-technologies-build-technologies-around-your-products-536g</link>
      <guid>https://forem.com/preslavrachev/dont-build-your-products-around-technologies-build-technologies-around-your-products-536g</guid>
      <description>&lt;p&gt;&lt;em&gt;This post was originally published on my &lt;a href="https://preslav.me/2019/01/18/dont-build-your-products-around-technologies/"&gt;blog&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;




&lt;p&gt;I had somehow assumed that this is something people learn, once they get burned by a few incidents. Yet, time and again, reality finds its ways to prove me wrong. So, I am compelled to say it once again:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Don't build your products around technologies. Build technologies around your products.&lt;/em&gt;  &lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;There was a time when I could swear my allegiance to any tool, framework, or library which happened to be at the heyday of public attention. I would toy around with it, talk about it, proclaim it to be the one and only solution to any problem, or use it to design solutions for problems I didn't really have. That is, until the company or community behind it lost interest in moving the technology forward. The technology would quickly lose its coolness factor, die out of public interest, and get replaced by a new one, supposedly even cooler. Usually, it goes like this:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Company ABC has a product problem&lt;/li&gt;
&lt;li&gt;ABC comes up with a clever solution: XYZ&lt;/li&gt;
&lt;li&gt;ABC decides to open-source XYZ&lt;/li&gt;
&lt;li&gt;XYZ becomes an overnight hit&lt;/li&gt;
&lt;li&gt;Other companies adopt XYZ for their products&lt;/li&gt;
&lt;li&gt;With so many use cases, XYZ eventually becomes a burden to maintain&lt;/li&gt;
&lt;li&gt;ABC gives up and moves along&lt;/li&gt;
&lt;li&gt;XYZ dies out or gets replaced by a new shiny toy&lt;/li&gt;
&lt;li&gt;Parts of XYZ eventually make it to a standard, but years after its fall from grace&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;This cycle would repeat a few more times over the following years. And every time, the same story: a technology would appear out of nowhere, quickly become overhyped, only to die out and get replaced soon thereafter, turning into a burden. And, while refactoring is fun the first few times, the bigger and more mature my work got, the more difficult it became to justify the cost of eventually having to get rid of that burden once it died out.&lt;/p&gt;

&lt;p&gt;This is the reason why, nowadays, I try to stick to a stack that, first and foremost, solves the concrete problem at hand, and second, stands on the shoulders of standards. Indeed, my weapons of choice might not always be the fancy ones that one reads about on Medium. They might not have an interactive documentation site or a vocal Twitter community, but they get the job done, and they have proven themselves over the years. As for the fancy ones, I do play with new things every day, and many of them find their way into my projects. Yet, just like in investing, a certain level of risk assessment is needed. This is where good architecture comes into play, clearly distinguishing components from one another, and abstracting dependencies away behind generic API contracts.&lt;/p&gt;
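&lt;p&gt;In Go terms (my choice of language for the sketch; the point itself is language-agnostic), such an abstraction can be as small as an interface that the product code owns, with the fashionable library of the day hidden behind it:&lt;/p&gt;

```go
package main

import "fmt"

// Notifier is the contract the product code owns. Swapping the
// underlying technology means writing a new implementation,
// not rewriting the callers.
type Notifier interface {
	Notify(msg string) error
}

// logNotifier is a stand-in implementation; a real one might wrap
// whichever messaging library happens to be trendy this year.
type logNotifier struct{}

func (logNotifier) Notify(msg string) error {
	fmt.Println("notify:", msg)
	return nil
}

func main() {
	var n Notifier = logNotifier{}
	if err := n.Notify("build finished"); err != nil {
		fmt.Println("error:", err)
	}
}
```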

&lt;p&gt;Final Note: A great technology is one that helps you solve a particular problem without becoming the centerpiece of your product. It conforms to standards and is thus easily swappable with any other one that does the job better. For those that don’t, it’s your responsibility to abstract them away with code that does. Don’t sugar-coat a piece of technology just because it’s nice to play with, or because the community tells you so. Or in other words: don’t put all of your bets on one thing, but rather, build your product as a collection of many loosely-connected things. Let new libraries, tooling, and frameworks sprinkle out of your particular needs instead. This is how standards get born.&lt;/p&gt;

</description>
      <category>dev</category>
      <category>culture</category>
      <category>technology</category>
      <category>practice</category>
    </item>
    <item>
      <title>Do you still use a macOS app launcher?</title>
      <dc:creator>Preslav Rachev</dc:creator>
      <pubDate>Tue, 15 Jan 2019 17:02:56 +0000</pubDate>
      <link>https://forem.com/preslavrachev/do-you-still-use-a-macos-app-launcher-2npd</link>
      <guid>https://forem.com/preslavrachev/do-you-still-use-a-macos-app-launcher-2npd</guid>
      <description>&lt;p&gt;There was a time, when I could swear my allegiance to one app launcher or another, as long as it was not Spotlight. At first, I believe, I started using the free version of &lt;a href="https://www.alfredapp.com/"&gt;Alfred&lt;/a&gt; some years ago, but then I moved over to &lt;a href="https://obdev.at/products/launchbar/index.html"&gt;Launchbar&lt;/a&gt; when I realized  that Alfred still pretty much relies on the Spotlight index for everything. At the time, I had manually &lt;a href="https://www.iclarified.com/49187/how-to-disable-and-reenable-spotlight-indexing-on-your-mac"&gt;knocked out&lt;/a&gt; the Spotlight indexing process, in order to save CPU cycles and spare my HDD (yeah, it was quite some time ago). Launchbar builds its own index and it is very fast and fine-grained to exactly an app launcher would need.&lt;/p&gt;

&lt;p&gt;I still use Launchbar as my day-to-day app launcher, even though it has been years since I stopped disabling the Spotlight indexing process. I still don't like the Spotlight UI, but I am aware that the index is used by many more apps beyond app launching itself.&lt;/p&gt;

&lt;p&gt;I am interested to know if you are still using a custom app launcher for macOS. If yes, I would like to hear your reasons, and perhaps get a few new interesting automation tips from you. Drop me a line in the comments.&lt;/p&gt;

</description>
      <category>discuss</category>
      <category>macos</category>
      <category>productivity</category>
      <category>tips</category>
    </item>
  </channel>
</rss>
