Kushal Das

FOSS and life. Kushal Das talks here.

Switching to Emacs land

After 16 years with Vi(m), a week back I switched to Emacs as my primary editor. I had used Emacs for a few days back in 2010, when the #lisp channel suggested it to me. But I neither continued with it, nor understood the keystrokes well.

But, why now?

I was trying out Org Mode in Vim. Looking at the strange key-combinations, I felt it was not for me, and decided to give Emacs a proper try for the first time. If you know about our summer training, or saw me discussing the GNU/Linux world in any college, you would have seen me suggest Vim to everyone as a starting point, mostly for two reasons.

  • It is there on default Fedora installation.
  • I still think it is easy enough for beginners to start with.

For me, to learn anything new, I prefer to use it regularly, be it a programming language or a particular tool. I wanted to see how Org Mode works in Emacs, but to do so well, I had to use it myself. That also meant I would have to use Emacs regularly.

How did I start?

I used a few different sources to start reading, and also configuring my init.el file. Shakthi Kannan has some excellent articles on Emacs. I also found another site which introduces Emacs and related configurations well. I am going to suggest both sites to anyone starting with Emacs. Shakthi also has a very good reference card for the keystrokes.

In the #dgplug IRC channel, maxking suggested starting with eshell and magit inside Emacs. I am using the presentation from Shakthi to learn magit.

The last few blog posts (including this one), and also a few commits in random places were done using this setup (inside of Emacs).

The problems

I am not going to say this was very smooth. Remembering the keystrokes is always difficult, and getting them into muscle memory takes even more time. But there are a few things which I found really difficult (for me).

  • Marking for cut/copy/paste.
  • Sometimes I see the same buffer in two split windows, and I can not close them easily.
  • Spell check keystrokes.

All the other regular Emacs users I know have been using it for more than 10 years. I am not trying to hurry; I will slowly learn the various tricks and HOWTOs. The famous XKCD post fits well in this regard.

Story of mashing in Bodhi (Fedora updates system)

Bodhi mash

This is a mostly Fedora release engineering specific post. Feel free to skip if you are not into Fedora land.

I started packaging for Fedora in 2006, back in the Fedora Extras repository days. Now we have only one big repo, and packagers can submit their new packages or updates to it. To do so, we first have to get the package reviewed (in the case of new packages). Then, after getting a git repo for the spec file, one can build the package in Koji, which creates the RPMs and the SRPM on the Koji server. Next, we mark the package for a push to the testing repository (or, after certain criteria are met, mark it for a push to the stable repository). For the rawhide (the latest of everything) repository we don’t have to do this; everything we build for rawhide is pushed automatically.

What happens after we mark the package ready for push?

Every day, someone from the Fedora release engineering team is on push duty. This person calls the bodhi-push command, which in turn generates a fedmsg, which is then consumed by the consumer defined in bodhi’s masher.py to generate new repos from the updates, and also to mark the packages properly in Koji. It also composes the ostree trees. We will now dig into the source of this command to see what all happens.

The work starts at line 271. At line 290, it initiates the state. There are use cases when it resumes from the middle of a previous mash; the if-else block at line 301 helps to load or save the state of the mash accordingly. To load or save, it actually reads a JSON file, plus a few other steps.

Next, at line 306, it calls a function load_updates. This function talks to the database and finds the list of packages which need to be pushed for updates (the final repo will be generated from all the packages in that particular tag). All of this work happens based on a unique tag, like f25-updates. The verify_updates function makes sure that the list of updates has the right updates for the current tag (it can happen that someone tries to mark an F24 build as an F25 update).

We have to do some more checks if this push is for the stable repositories: does the build have enough positive karma, and has it spent 7 days in the testing repository? At line 310, the perform_gating function does these gating checks.

At line 312, in the determine_and_perform_tag_actions function, we talk to Koji and update the tags there as required. It either adds a new tag or moves the build to a new tag, for example from candidate to testing. After that, we update the Bugzilla entries if there are any updates with a bug associated, and in the next few lines we remove any extra tags and also update the comps files (from the git repo). At line 321, we create a new thread which does the real mash work, using the mash command from the package of the same name. While we wait for this thread to finish on line 330, we have already created some digest mail information and other updateinfo (in the uinfo variable, which is the content of updateinfo.xml). On line 337 we check whether we are supposed to build ostree repos for Atomic; if yes, we call the compose_atomic_trees function.

Then we sync the newly generated repo with the master mirror, and wait for it to finish (on line 345). After that, it sends out fedmsg notifications, modifies the bugs in Bugzilla, adds comments to the Koji updates, and also sends the announcement emails (the function calls are documented).
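The whole flow above can be condensed into a toy outline. This is only a sketch: the step names mirror the masher.py functions mentioned in the post, but the bodies are stand-in stubs that just record the order of operations, not the real implementation.

```python
# Toy outline of the bodhi push flow described above. The step names
# mirror functions from masher.py; the bodies are stand-in stubs.
steps = []

def step(name):
    steps.append(name)

def mash(tag, stable=False):
    step("init_or_resume_state")        # load/save mash state via a JSON file
    step("load_updates")                # DB query for builds in this tag
    step("verify_updates")              # e.g. reject an f24 build in f25-updates
    if stable:
        step("perform_gating")          # karma / days-in-testing checks
    step("determine_and_perform_tag_actions")  # add/move tags in Koji
    step("start_mash_thread")           # the real `mash` command runs here
    step("prepare_digest_and_updateinfo")      # done while the mash runs
    step("compose_atomic_trees")        # ostree compose, if configured
    step("wait_for_mash_thread")
    step("sync_to_master_mirror")
    step("send_notifications")          # fedmsg, Bugzilla, Koji comments, emails
    return list(steps)

print(mash("f25-updates", stable=True))
```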

The main reason I wanted to write this post is the upcoming change where we will stop using the mash command, and instead call pungi to do the work, as pungi can also do the ostree tree compose.

I want to specially thank Patrick who helped me to understand this piece of code (as he does for many other things).

Touch Typing

Learning new things is an essential part of life. While we spend a lot of time learning various tricks about our tools or a particular programming language, many newcomers miss another important common skill.

The art of Touch Typing.

Ever since I started going to conferences, I have met many people who touch type, and they generally type really fast. I found that to be very common in our circles. But when we meet beginners and discuss the things they should learn, we completely miss this point. Most of the beginners I’ve met can’t type well. And most of the errors people ask about are caused by, guess what?

Typing mistakes.

Also, because they type very slowly, beginners lag behind in workshops.

GNU Typist

In the dgplug summer training, right after we’re done with the communication sessions, we ask people to start spending some time learning to type.

My favorite tool is GNU Typist. It’s a small command line tool which can help anyone learn touch typing in a few days. Remember that the package name is gtypist.

In the main menu, you choose one of the many courses shown. The “Quick QWERTY” course is powerful enough to give you a start. After a few screens of description, about how to use the tool, you get into a screen like the one shown below.

As you see, any error will be marked by the tool. In the beginning, it is okay if you keep checking where your fingers are. If you spend a week with this tool, you should be able to start typing faster, and with fewer errors. Your muscle memory will kick in, and you will be amazed by your new super power :) When I first used the Das Ultimate (no relation :P) or the Kinesis Advantage keyboard, I spent the first few minutes in gtypist to become familiar with them.

KDE Touch

KDE Touch is a GUI application to learn how to type. It will give you similar levels of detail, using various pretty looking graphs & charts. If you do not like command line tools, you can always start learning with this one.

There is also the Tux Typing tool, which is aimed more at kids. The tool shows various words, and if you type them properly, you will be able to provide food to the nice little penguin.

People who are reading this will most probably spend the rest of their lives in front of computers (for various reasons). Learn to type well; having that muscle memory is a powerful tool and will be a source of great strength for you.

Installing Python 3.6.1 on your Fedora 24/25

Yesterday, one of the participants in the dgplug summer training was trying to use Python 3.6.1 on a Fedora 24 box. There were some issues in the installation; I actually don’t know how the installation was done. So, I just suggested building Python 3.6.1 from source, and then having as many virtual environments as required to learn Python.

The following commands can help anyone build from source on Fedora 24 or Fedora 25.

$ sudo dnf install dnf-plugins-core wget make gcc
$ sudo dnf builddep python3
$ wget https://www.python.org/ftp/python/3.6.1/Python-3.6.1.tgz
$ tar -xvf Python-3.6.1.tgz
$ cd Python-3.6.1
$ ./configure --with-pydebug
$ make -j2

After the above commands, you will have a python binary. The next step is to create a virtual environment.

$ ./python -m venv myenv
$ source myenv/bin/activate
(myenv)$ python
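Once the environment is activated, a quick check from inside Python confirms which interpreter you are running:

```python
import sys

# Print the running interpreter's version; inside the freshly built
# environment this should report (3, 6, 1)
print(sys.version_info[:3])
```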

Now you have your own Python 3.6.1. Happy hacking :)

Looking back at the history of dgplug and my journey

During a session of the summer training this year, someone asked about the history of DGPLUG and how I started contributing to it. The story of dgplug has an even longer back-story about my history with Linux. I’ll start there, and then continue with the story of dgplug.

Seeing Linux for the first time

During my class 11-12, I used to spend a lot of time in the hostels of the Regional Engineering College, Durgapur (or as we call it, REC, Durgapur). This institute is now known as NIT, Durgapur. I got to lay my hands on and use a computer (mostly to play games) in my uncle’s hostel room. All the machines I saw there were Windows only.

My Joint Entrance Examination (JEE) center was the REC (year: 2001). That was a known & familiar environment for me. During the breaks on day 1, I came back to hall 2 (my uncle’s room) for lunch. The room was unusually full; I think more than 7 people were looking at the computer with very anxious faces. One guy was doing something on it. My food was kept in the corner, and someone told me to eat quietly. I could not resist, and asked what was going on.

We’re installing Linux on the computer, and this is a very critical phase.
If the mouse works, then it will just work; or else we will not be able to use it in the system at all, someone replied.

I had to ask again, What is this Linux? “Another Operating System” came the reply. I knew almost everyone in the room, and I knew that they used computers daily. I had also seen a few of them writing programs to solve their chemical lab problems (my uncle was in the Chemical Engineering department). There were people from the computer science department too. The thing that stuck in my head, going back to the next examination, was that one guy, who knew something which the others did not. It stuck deeper than the actual exam at hand, and I kept thinking about it all day. Later, after the exam, I actually got some time to sit in front of the Linux computer, and I tried to play with it, trying things and clicking around on the screen. Everything was so different from what I was used to seeing on my computer screen. That was it, my mind was set; I was going to learn Linux, since not many knew about it. I would also be able to help others as required, just like my uncle’s friend.

Getting my own computer

After the results of the JEE were announced, I decided to join the Dr. B. C. Roy Engineering College, Durgapur. It was a private college, opened a year earlier. On 15th August, I also got my first computer: a Pentium III, with 128MB of RAM and a 40GB hard drive. I also managed to convince my parents to get a mechanical keyboard (which was costly and unusual at the time) with this setup. I got Linux (RHL 7.x IIRC) installed on that computer along with Windows 98. Once it got home, I kept clicking on everything in Linux, ran the box ragged, and tried to find all that I could do with Linux.

After a few days, I had some trouble with Windows and had to reinstall it, and when I rebooted, I could not boot into Linux any more. The option had disappeared. I freaked out at first, but guessed that it had something to do with my Windows reinstall. As I had my Linux CDs with me, I went ahead and installed it again. Installing and reinstalling the operating systems over and over gave me the idea that I would have to install Windows first, and only then install Linux; otherwise, I could not boot into Linux.

Introduction to the Command Line

I knew a few Windows commands by that time. Someone in REC pointed me to a book written by Sumitabha Das (I still have a copy at home, in my village). I started reading from there, and learning commands one by one.

Becoming the Linux Expert in college

This is around the same time people started recognizing me as a Linux Expert, at least in my college.
Of course I knew how to install Linux, but the two major things that helped me get that tag were

  • the mount command: I knew how to mount Windows partitions in Linux.
  • the xmms-mp3 rpm package: I had a copy, and I could install it on anyone’s computer.

The same song, on the same hardware, playing in XMMS always gave much better audio quality than Windows ever did. Just knowing those two things gave me a lot of advantage over my peers in that remote college (we never had an Internet connection in the college, IIRC).

The Unix Lab & Introduction to computer class

We were introduced to computers in our first semester, in some special class. Though many of my classmates were seeing a computer for the first time in their lives, we were tasked to practice many (DOS) commands on the same day. I spent most of my time helping others learn about the hardware and how to use it.

In our college hostel, we had a few really young professors who also stayed with us. Somehow I started talking with them a lot, and tried to learn as many things as I could. One of them mentioned a Unix lab in the college which we were supposed to use in the coming days. I went back to the college the very next day and managed to find the lab; the in-charge (the same person who told me about it) allowed me to get in and use the setup (there were 20 computers).

Our batch used that lab for only 2-3 days at the most. During the first day in the lab, I found a command to send emails to the other users. I came back during some off hours, wrote a long mail to one of my classmates (not going to talk about the details of that mail), and sent it out.

As we stopped using that lab, I was sure no one had read that mail. Except, one day, the lab in-charge asked me how my email writing was going. I was stunned; how did he find out about the email? I was spluttering, tongue-tied! Later that night, he explained to me the idea of sysadmins, and all that a superuser can do in a Linux/Unix environment. I started thinking about privacy in the electronic world from that very night :)

Learning from friends

The only other people who were excited about Linux were two people from the same batch in REC: Bipendra Shrestha, and Jitu. I used to spend a lot of time in their hostel and learned so many things from them.

Internet access and the start of dgplug

In 2004, I managed to get more regular access to the Internet (by saving up a bit more money to visit Internet cafes regularly). My weekly allowance was Rs.100, and regular one hour Internet access was around Rs.30-50.

While reading about Linux, I found the term LUG, or Linux User Group. As I was the only regular Linux user in college, I knew that I never had much chance to learn more on my own there, and that somehow I would have to find more people like me and learn together. Around the same time, I started learning about words like upstream, contribution, Free Software, & FSF. I managed to contact Sankarshan Mukhopadhyay, who sent me a copy of Ankur Bangla, a Linux running in my mother tongue, Bengali. I also came to know about all the ILUG chapters in India. That inspired me; having our own LUG in Durgapur was my next goal. Soumya Kanti Chakrabarty was the first person I convinced to join with me to form this group.

The first website came up on Geocities (fun times), and we also had our yahoo group. Later in 2005, we managed to register our domain name; the money came in as a donation from my uncle (who by this time was doing his Ph.D. in IIT Kanpur).

I moved to Bangalore in July 2005, and Soumya was running the local meetings. After I started using IRC regularly, we managed to have our own IRC channel, and we slowly moved most of our discussion over to IRC only. I attended FOSS.IN in December 2005. I think I should write a complete post about that event, and how it changed my life altogether.

Physical meetings in 2006-2007

A day with Fedora on 4th April 2006 was the first big event for us. Sayamindu Dasgupta, Indranil Dasgupta, and Somyadip Modak came down to Durgapur for this event. This is the same time when we started the Bijra project, where we helped the school to have a Linux (LTSP) based setup, completely in Bengali. This was the first big project we took on as a group, and it also got us some media coverage back then. This led to a bigger meetup during 2007, when NRCFOSS members including lawgon and Rahul Sundaram came down to Durgapur.

dgplug summer training 2008

In 2008, I pitched the idea of having a summer training over IRC, following the same rules of meetings as Fedora marketing on IRC. Shakthi Kannan also glommed onto the idea, and that started a new chapter in the history of dgplug.

Becoming the active contributor community

I knew many people who are better than me when it comes to brain power, but generally, there was no one to push them to always learn new things. I guess the motto of dgplug, “Learn and teach others”, helped us overcome this obstacle and build a community of friends who are always willing to help.

শেখ এবং শেখাও (Learn & Teach others).

Our people are from different backgrounds, from various countries, but the idea of Freedom and Sharing binds us together in the group known as dgplug. Back in 2015 at PyCon India, we had a meeting of all the Python groups in India. After listening to the problems of all the other groups, I suddenly realized that we had none of those problems. We have no travel issues, no problems getting speakers, and no problem getting new people to join in. Just being on the Internet helps a lot. Also, people in the group have strong opinions, which means healthy but long discussions sometimes :)

Now, you may have noticed that I did not call the group a GNU/Linux users group. Unfortunately, by the time I learned about the Free Software movement and its history, it was too late to change the name. This year in the summer training, I will take a more in-depth session about the history of hacker ethics and the Free Software movement, and I know a few other people will join in.

The future

I wish that dgplug continues to grow along with its members. The group does not limit itself to software, or technology. Most of the regular members met each other at conferences, and we keep meeting every year at PyCon India and PyCon Pune. We should be able to help others learn and use the same freedom (be it in technology or in other walks of life) that we have. The IRC channel should continue to be the happy place it always has been, where we all meet every day and have fun together.

Second update from summer training 2017

We are already at the end of the second week of the dgplug summer training 2017. From this week onwards, we’ll have formal sessions only 3 days a week, and guest lectures will start only from next month. This post is a quick update about the things we did over the last 2 weeks.

  • Communication guideline review: We use Shakthi Kannan’s communication guideline and mailing list guideline in the training. We still have a few people having trouble with not typing SMS language in the chat or in the mailing list. But, we’re learning fast.
  • Basics of Linux command line tools: We used to have only the old logs and the TLDP bash guides for this. Last year I thought of writing a new book (along the lines of Python for you and me), but did not manage to push myself to do so. This year, I have started working on Linux command line for you and me, though I haven’t yet edited the things I’ve written. We are using this book as a starting point for the participants, along with the old logs.
  • GNU Typist: We also ask people to practice typing. We suggest using gtypist to learn touch typing. There are a few blog posts from the participants talking about their experience.
  • We had two sessions on Vim. Sayan will take another special session on Vim in the coming days.
  • We talked about blogs, and asked everyone to start writing more. Writing is important; be it technical documentation, or an email to a mailing list – it is the way we communicate over the Internet. We suggested Wordpress to the beginners. If you are interested in seeing what the participants are writing, visit the planet.
  • The Internet’s Own Boy: This week we also asked everyone to watch the story of Aaron Swartz. The summer training is not about just learning a few tools, or learning about projects. It is about people, about freedom. Freedom to do things, freedom to learn. One of our participants, Robin Schubert (who is a physicist from Germany) wrote his thoughts after watching the documentary. I am hoping that more participants will think more about what they saw.
    Following this, we will also have a session about the history of Free Software, and why we do, what we do. The ideology still matters. Without understanding or actually caring about the issues, just writing code will not help us in the future.
  • Next, we asked people to submit their RSS feeds so that we can add them to the planet. We also learned Markdown, and people noticed how Aaron was involved in both.

In the coming days, we will learn about a few more tools, how to use programming to automate things in life, how to contribute patches to upstream projects, and related things. But we will also have sessions on software licenses; Anwesha will take the initial session on those. The guest sessions will also start. If you are interested in teaching or sharing your story with the participants, please drop me a note (either email or twitter).

Updates on my Python community work: 16-17

Thank you, everyone, for re-electing me to the Python Software Foundation board for 2017. The results of the vote came out on June 12th. This is my third term on the board; 2014 and 2016 were the last two terms. In 2015 I was out, as the random module decided to choose someone else :)

Things I worked on last year

I was planning to write this in April, but somehow my flow of writing blog posts was broken, and I never managed to do so. But, better late than never.

As I had written on the wiki page for candidates, one of my major goals last year was building communities outside of the USA. Given the warm welcome I have received in every upstream online community (and also at physical conferences), we should make sure that others are able to have the same experience.

As part of this work, I worked on three things:

  • Started PyCon Pune, the goal of the conference being upstream first
  • Led the Python track at FOSSASIA in Singapore
  • Helped with the local PyLadies group (they are in the early stage)

You can read about our experience in PyCon Pune here. I think we were successful in spreading awareness about the bigger community which stands out there on the Internet, throughout the world. All of the speakers pointed out how welcoming the community is, and how Python, the programming language, binds us all, be it in scientific computing or on small embedded devices. We also managed to have a proper dev sprint for all the attendees, where people made their first ever upstream contributions.

At FOSSASIA, we had many professionals attending the talks, and the kids were having their own workshops. There were various other Python talks in different tracks as well.

Our local PyLadies Pune group still has many more beginner Python programmers than experienced members. Though many of them work with Python in their jobs, they have never worked with the community before. So, my primary work there was not only providing technical guidance, but also trying to make sure that the group itself gets better visibility in the local companies. Anwesha writes about the group in much more detail than me, so you should go to her blog to learn about it.

I am also the co-chair of the grants working group. As part of this group, we review the grant proposals the PSF receives. As the group members are distributed, we generally manage to get good input on these proposals. The number of grant proposals from every region has increased over the years, and I am sure we will see more events happening in the future.

Along with Lorena Mesa, I also served as a communication officer for the board. She took charge of the board blog posts, and I was working on the emails. I was finding it difficult to calculate the amounts, so I wrote a small Python 3 script which helps me get total numbers for every month’s update. This also reminds me that I managed to attend all the board meetings (they are generally between 9:30 PM and 6:30 AM for me in India) except the last one, just a week before PyCon. Even though I was in Portland at that time, I was confused about the actual time of the meeting, and jet lag did not help either.
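The amounts script was roughly along these lines (a sketch only; the input format below is invented for illustration, not the real report format):

```python
import re
from collections import defaultdict

# Sum grant amounts per month from lines like
# "2017-03: PyCon Namibia $1,500" (hypothetical format).
LINE = re.compile(r"^(\d{4}-\d{2}):.*\$([\d,]+)")

def monthly_totals(lines):
    totals = defaultdict(int)
    for line in lines:
        m = LINE.match(line)
        if m:
            totals[m.group(1)] += int(m.group(2).replace(",", ""))
    return dict(totals)

report = [
    "2017-03: PyCon Namibia $1,500",
    "2017-03: Django Girls workshop $600",
    "2017-04: PyLadies meetup $250",
]
print(monthly_totals(report))  # → {'2017-03': 2100, '2017-04': 250}
```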

I also helped our amazing GSoC org-admin team; Terri is putting in countless hours to make sure that the Python community gets a great experience in this program. I am hoping to find good candidates in Outreachy too. Last year, the PSF had funds for it but did not manage to find a good candidate.

There were other conferences where I participated in different ways. Among them, the Science Hack Day India was very special; working with so many kids, learning Python together in the MicroPython environment, was a special moment. I am eagerly waiting for this year’s event.

I will write about my goals in the 2017-18 term in a future blog post.

dgplug summer training 2017 is on

Yesterday evening we started the 10th edition of the dgplug summer training program. We had around 70 active participants in the session; a few people informed us beforehand that they would not be available during the first session. We also knew that at the same time there was an India-vs-Pakistan cricket match, which meant many Indian participants would miss day one (though it seems the Indian cricket team tried their level best to make sure that participants stopped watching the match :D ).

We started with the usual process: Sayan and /me explained the different rules related to the sessions, and also about IRC. The IRC channel #dgplug is not only a place to discuss technical things, but also a place to discuss everyday things among many of the dgplug members. We ask the participants to stay online as long as possible in the initial days, and to ask as many questions as they want. Asking questions is a very important part of these sessions, as many are scared to do so in public.

We also had our regular members in the channel during the session, and after the session ended, we got into other discussions as usual.

One thing I noticed was the high number of students participating from the Zakir Hussain College Of Engineering, Aligarh Muslim University, Aligarh, India. When I asked how come so many of them were there, they said the credit goes to cran-cg (Chiranjeev Gupta), who motivated the first year students to take part in the session. Thank you cran-cg, for not only taking part but also building a local group of Free Software users/developers. We also have Nisha, a fresh economics graduate, taking part in this year’s program.

As usual, day one was on a Sunday, but from now on all the sessions will be on weekdays only, unless it is a special guest session where a weekend is better for our guest. Our next session is at 13:30 UTC today, in the #dgplug channel on the Freenode server. If you want to help, just be there :)

PyLadies Pune 2017 June meetup

Last weekend we had the PyLadies Pune June meetup. The day started with Shilpee Chamoli taking a data science 101 with Python. As a few people were having trouble installing Jupyter and the other dependencies, I suggested using Azure notebooks, and a few of us did that. This was the first time I was attending a pandas workshop, even though I packaged it for Fedora a few years back. The simple problem related to the Titanic data was fun to work on.
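To give a flavor of that exercise, here is a tiny example in the same spirit (the numbers below are made up for illustration, not the real Titanic data):

```python
import pandas as pd

# A made-up sample in the spirit of the Titanic exercise:
# compute the survival rate grouped by sex.
df = pd.DataFrame({
    "sex": ["female", "female", "male", "male", "male"],
    "survived": [1, 1, 0, 1, 0],
})
rate = df.groupby("sex")["survived"].mean()
print(rate)
```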

Next, Anwesha shared her experience at PyCon US, and she & Sayan also played the videos they had recorded during PyCon. It was a surprise to all of the attendees. In the videos, we had messages from Lynn Root, Ewa Jodlowska, Carol Willing, and Jackie Kazil to the PyLadies Pune group members. Looking at people’s faces, I think that was a really good idea from Anwesha.

After lunch, we started discussing project ideas. Sayan, Praveen Kumar, Chandankumar, and I were helping the groups with their ideas. I will be mentoring a group of students for their final year project, which is about a command line bookmarking application, with future features like a corresponding web frontend. Right now, they are busy with the most difficult part of the project: choosing a name for it :)
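For the curious, the core of such a command line bookmarking tool can be sketched in a few lines (a hypothetical sketch of mine, not the students’ actual project code):

```python
import sqlite3

# Minimal core of a command line bookmarking tool: store URL + title
# pairs in SQLite and list them back.
def init_db(conn):
    conn.execute(
        "CREATE TABLE IF NOT EXISTS bookmarks "
        "(id INTEGER PRIMARY KEY, url TEXT UNIQUE, title TEXT)"
    )

def add(conn, url, title):
    # OR IGNORE keeps the same URL from being bookmarked twice
    conn.execute(
        "INSERT OR IGNORE INTO bookmarks (url, title) VALUES (?, ?)",
        (url, title),
    )

def list_all(conn):
    return conn.execute("SELECT url, title FROM bookmarks ORDER BY id").fetchall()

conn = sqlite3.connect(":memory:")
init_db(conn)
add(conn, "https://kushaldas.in", "Kushal's blog")
add(conn, "https://dgplug.org", "dgplug")
print(list_all(conn))
```

A real tool would wrap these functions with argparse subcommands (add, list, delete), which is a natural next step for the project.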

Emails: my kryptonite

My first experience of the Internet was in 1999; I also made my first ever email account that same day. A lot of things have changed since then. I don’t have to pay Rs.120 per hour (which was way more than my monthly allowance at that time) to access the Internet any more, nor wait in a long queue to get access to a computer in the Internet center. As a note, my village did not have an Internet center at that time; I had to visit the nearest town, Durgapur, to access the Internet.


Around 2005, I created my Gmail ID, kushaldas AT Gmail. I was happy to have one single final email ID which I could keep using forever. And it was like that for many years, till I suddenly figured out that I was getting way too many emails from different lists. I could not follow all the emails any more, even though I made many filters to move the emails to different labels (I thought they were normal folders, but it seems they are not, in Gmail’s case). Deleting emails also became troublesome work.

Suddenly, email became one of my weaknesses. I could not reply on time to everything, and I could not find the mails I wanted. I started losing emails. There are too many emails in my Gmail INBOX; sometimes the Google web interface happily crashed while I tried to delete a lot of emails together.


Meanwhile, a few years back, I moved to mutt as my primary email client. Things were better for the first few months; at least mutt was much faster at opening my big INBOX than the other clients. Btw, the emails are all locally cached using the mbsync tool (part of the isync package). But then I discovered the problem of deleting emails from the mutt interface for Gmail: the standard delete option just untags the mail, but it stays back in the All Mail folder. My Gmail storage space slowly started filling up.

I later figured out that I could move my emails to the trash for a proper delete. But that was taking more time. And after filling up the free 15GB, I had to start paying Google for more space.

mail AT kushaldas.in

Around a year back, I created this email ID. This is maintained by Kolab, and they are simply awesome: a pure FOSS solution provided by folks who really value freedom. The IMAP also works following the standard IMAP interface, with no fancy non-standard features like Gmail’s. I am using the imapfilter tool to have more control over the mail filters. mutt is still the mail client for this account too. Even after moving most of the work to this account, I still did not delete emails on time.

Clean up process

This week, during my regular 1x1 with my manager Paul Frields, he showed me some details about how he manages his Gmail account. I never knew so much about the search/filter system. I am yet to try out all the options he showed. But, yesterday, I once again tried to clean up.

First, I deleted all the different mailing list folders in my personal ID; just going into a folder and pressing D~A was enough to mark all the emails in that folder for deletion. After that, I had to go through the whole INBOX and delete all the random emails: I kept selecting by pressing Ctrl+d, and every few minutes pressed $ to clean up. Later, when I ran the mbsync tool, it synced and cleaned up space as I had hoped.

For Gmail, I deleted emails using the web interface for the random mailing lists, and later, when I tried to clean the trash, the web frontend was not that happy and crashed a few times. I was deleting emails in 66K, 20K, and 12K batches.
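Batch deletes like this can also be scripted over IMAP instead of fighting the web interface. Here is a hedged sketch using Python’s imaplib; the server, folder, and credentials below are placeholders, and on Gmail a real delete would mean moving messages to the Trash folder first rather than just flagging them:

```python
import imaplib

# Sketch: flag messages older than a cutoff date as \Deleted in batches,
# then expunge. Server and account details below are placeholders.

def chunked(ids, size):
    """Split a list of message ids into comma-joined batches for STORE."""
    for i in range(0, len(ids), size):
        yield ",".join(ids[i:i + size])

def delete_before(conn, folder, date):
    conn.select(folder)
    # date is in IMAP format, e.g. "01-Jan-2015"
    _, data = conn.search(None, "BEFORE", date)
    ids = [i.decode() for i in data[0].split()]
    for batch in chunked(ids, 500):
        conn.store(batch, "+FLAGS", r"(\Deleted)")
    conn.expunge()

# Example (not run here; needs real credentials):
# conn = imaplib.IMAP4_SSL("imap.example.com")   # placeholder server
# conn.login("user@example.com", "app-password")  # placeholder creds
# delete_before(conn, "INBOX", "01-Jan-2015")
# conn.logout()

print(list(chunked(["1", "2", "3"], 2)))
```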

But even after deleting more than 100K emails, I find there are still around 236K emails in my main INBOX, and of those, around 30K are unread. I am going to take a lot more time to select them carefully, and then delete them. If you have any tips to mark emails to delete from the archive during this process, please let me know. That will help me a lot :)

Replying to emails

For the last few months, I have been trying to reply to emails as soon as I read them. Whenever I used to think I would wait for some free time and write a response after thinking it over, I never sent out a reply to those emails :(. At least by replying then and there, things are better for me. I am replying to emails on time and not losing them.