All posts by Eliezer Croitoru

The Squid “Persona” - Squid 3.5.21 + 4.0.14

Who Is The Squid Girl Persona?


As a Squidder I know a girl or two, but others are on the lookout for their Squid Girl Persona. It's not a simple task!!
To illustrate the idea: we are used to a web with all sorts of GET, POST, OPTIONS, PROPFIND and other normal methods.
But the Squid Girl is something else. She has brains!

I mean, what would an empty shell be like? Would a Squidder ever want a SHELL-only system? Wouldn't some graphics or a UI be nice?
We do need some humans with enough Persona to take the SHELL up. As a friend told me once: we have great code, a great kernel, a great compiler, but.. with only one missing feature! The UI can be customized.
What can you say to someone like that? I mean, if the software gives you a full list of OPTIONS for the UI, what would you choose?

Despite my preference for SHELL and scripting languages, I believe that a good ICON and a GOOD Persona give a good taste to the whole cup.

Now let's say we have a full-stack proxy (AKA router) and we can connect a network of Squids; what would be the next step?
Would it be plausible to “share” a cache_dir?
What about a ReadOnly cache_dir?
It sounds a bit weird, but with good Cache Operators I believe it would be possible to enhance a Squid-Cache network to serve as a global sharing system (comparable to a CDN).
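For illustration only, a sketch of how two Squids can already "share" their caches today with stock directives: sibling cache peers asking each other over ICP before going to the origin. The addresses, ports and the localnet ACL below are assumptions, not a tested setup:

```
# Hypothetical peering sketch (addresses are examples): ask sibling
# caches via ICP before fetching from the origin, and answer their
# ICP queries about our own cache in return.
icp_port 3130
icp_access allow localnet

cache_peer 192.0.2.10 sibling 3128 3130 proxy-only
cache_peer 192.0.2.11 sibling 3128 3130 proxy-only

# Each node still has its own cache_dir; a truly shared or read-only
# cache_dir is not a stock feature, so the "sharing" happens over ICP.
cache_dir aufs /var/spool/squid 10000 16 256
```

A real sharing network would of course need more than this, but it shows that the building blocks are already in squid.conf.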

Google, Facebook, Yahoo, MS and a couple of others could be the main INPUT systems, and the Squid-Cache Operators could push content into the right network at the right place and the right time.

The Squid Girl persona would be there just waiting for the Squidder, and using a proxy, or two, or three, or a BIG network, two Squidders can be connected.

I am happy to say that, from my side of the picture, Squid 4 feels stable to me; others need to confirm that it works as expected on much larger systems.

Eliezer Croitoru

On the plate:

  • A full Squid TPROXY load balancer with PROXY protocol connection initiation.

References:

Faster is not always the answer!!

I am happy to publish the article for:
Squid-Cache 3.5.20 and 4.0.12 beta release.

The details about the RPMs repository are at squid-wiki.
RPMs Available for CentOS, Oracle Linux, OpenSUSE Leap

Faster is not always the answer!!

When do clients not complain?
What I mean is: have you ever seen a client complain about the speed of their Internet connection? No, I do not mean complaining that it's too slow, but that it's too fast?

I had the pleasure of meeting a couple of clients who complained that the computer had been slow since their Internet connection speed was upgraded. No, it wasn't a joke; it is reality.

The scenario needs some background and context to sound a bit more realistic:
The client is about 80 years old and the PC is 2-3 years old. When the Internet connection was slow, the OS updates and the AV's P2P connections were slow. Every day the computer was shut down around a specific hour and, if required, some updates were applied. Now the issue is that since the Internet speed got faster, an AV update was applied every couple of hours, and almost every couple of days an OS update was back on the table. The main issue was speed, but with a twist: "when I disconnect the router it works faster," he stated.
Actually, it took me quite a while to understand that a simple desktop with about 4GB of RAM should be enough to run Skype, Word, email and a couple of tiny console-based pieces of software.

So why? Why did the PC get slower?
I really do not know! It could be the flood of IOPS dumped on a 5400 RPM HDD, or the AV scanning the 2GB of updates repeatedly. I cannot answer what I never understood, and from what I did understand, faster is not always the good answer. However, I can imagine that verifying that every signature of a file is still what it should be might not be so easy for every PC.

These days I am counting the 10th month in which my local testing Squid runs in a "full" HTTP response digest mode. Every single response is digested using the SHA256 hash function, and it feels like it's not there at all. It's not affecting my tiny 15Mbps line-rate downloads or my tiny server farm.
Oh well, it's not the full and whole truth!!
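For the curious, the digest mode above boils down to hashing every response body as it streams through the proxy. This is not the actual eCAP adapter's code, only a minimal Python sketch of the chunk-wise SHA256 digesting idea:

```python
import hashlib

def digest_stream(chunks):
    """Digest a response body chunk by chunk, as a proxy adapter would,
    without ever holding the whole body in memory."""
    h = hashlib.sha256()
    for chunk in chunks:
        h.update(chunk)
    return h.hexdigest()

# Simulated response body arriving in pieces:
body = [b"HTTP response ", b"body, streamed ", b"in chunks"]
print(digest_stream(body))
```

Hashing chunk by chunk means the adapter never needs the whole body in memory at once, which is one reason a 15Mbps line barely feels it.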

The full truth is that the users agreed to use the service in any form, since they care more about their mind and soul than about their comfort. They decided that they need some filtering system when they insert data into their minds through their eyes. It's as simple as it sounds. They know that their minds should be guarded behind a couple of NAT systems and a couple of IDS+IPS layers, since there are a couple of weird ideas out there on the Internet.

I ask myself questions like these a couple of times every single day:

  • How do you want others to treat you when you have some need?
  • Would you want others to do everything for you?
  • How would a “Plate Of Gold” look like?

And then my IDS+IPS system throws a big fat text exception at me with the header "We are humans, we need others!".

And indeed this is an IDS+IPS which I didn't build, and every once in a while I ask myself: how many digest functions are in there?

  • CRC32
  • MD5
  • SHA1
  • SHA256
  • SHA512
  • SHA1024
  • SHA∞ ?

Is there an AES based one also in there?
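Of the names in that list, the first five are real digest functions (SHA1024 and SHA∞ are, of course, a joke), and Python's hashlib can show how their output sizes differ:

```python
import hashlib

data = b"some response body"

# CRC32 is a 32-bit checksum, not a cryptographic hash, so it is
# not in hashlib; the cryptographic digests and their sizes in bits:
for name in ("md5", "sha1", "sha256", "sha512"):
    h = hashlib.new(name, data)
    print(f"{name.upper():6}: {h.digest_size * 8} bits")
```

The bigger the output, the smaller the chance that two different inputs collide, which is the whole point of the next discussion.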
And my answer is that I do not know what's in there, but I can see some "reflection" of something greater and better. Then I start to wonder: why do all these clients want their well-formed, solid and mature minds to be proxied by any solution? Would any human-made solution ever match our genes?

I cannot give any "scientific" opinion, but I can bring to the table things from others which carry more weight than I do, whether from life experience or scientific research. These do claim that the human genes are not "perfect" and that therefore there is always a need to "spice" the human mind and soul in order to allow it some level of progress. The simplest example of humans being affected is that kids try to learn from their parents and later, with time, try to learn from others. This learning curve can teach us that genes are not "everything".

The answer will not always be "faster" once you reach the state of understanding and believing that it's a rocket aimed at your mind, hitting it with words, pictures, tables, shapes and other things.

But!! Don't get paranoid!! It's enough that you have another person in the house next to you, and you are safe enough not to lose your mind. It's enough that there is someone who can be asked, directly or through a proxy, and this world already feels much better than it did a couple of seconds ago.

All The Bests,
Eliezer Croitoru

DNS as an API

I am happy to “Certify” Squid-Cache version 3.5.19 as
“Works For Me” on
CentOS (6+7), SLES (12 SP1), Oracle Linux (6+7), RHEL (7), openSUSE (42.1 Leap), Debian (8.4), Ubuntu (14.04+16.04)

HTTP is commonly used as an API for many purposes in every industry, and in many cases, if you analyze an API's specs and output, you can see that some thinking was invested in it.

Around the Internet we can find many ideas about APIs, and while some are well published, others are long forgotten and considered "old". It is true that when you look at some of the APIs they might seem "cryptic" or "malformed", but they have a purpose. Most of these APIs were meant to be public, and as users we have access to all of them. But many APIs also require some level of authentication or authorization, and were clearly meant not to be fully public.

Some hackers around the world see an opportunity to "hack" something whenever possible. From my own APIs, which include HTTP, SMTP, DNS, WiFi HotSpot, Mobile and many others, it is clear that some might think it's funny to send malformed packets towards a router or an AP. But I feel there is a need to clear a couple of things up for any hacker.

Behind any system on the Internet there is a person who deserves respect. The fact that the API is there does not mean that its owner allows you to hack it, unless it was designed for that.
Comparing the real world to Internet APIs: not anyone can enter any door or any place. Not anyone can enter a closed party or a secured area. It would be a bit different, since the minimum requirements to enter one place would not be the same as another's.
For example, in the hackers' world it's known that there are ways to prove your value and earn your "nick" or "name". Some hacking cultures are restrictive in their approach and respect any API, avoiding the flames of war, while others think it's better to hack some API as a Proof Of Concept or a Proof Of Knowledge.

White? Black? Green? Red? Is there any meaning to all of these?
My answer is that all of these are hats; I do not have one and I do not want one. I am a simple person who has a couple of very simple APIs in his hands. But I have learned that giving a good example is a profession. Specifically, it's not simple to give an example to a hacking kid. If a hacking kid wants to hack something then, like in the real world, there are playgrounds for this sole purpose; an example would be canyouhack.it. Also, these days, if you want to learn how things work at the micro level, we have lots of free and open virtualization platforms. These exist in every part of the industry, from the electricity level up to the application.
All these tools were meant for the sole purpose of making the learning curve easy, simple and safe: to wield a real-world power tool in an environment that will tolerate things which might not be acceptable against real-world APIs.

The DNS system was invented long before HTTP, and it's an API just like HTTP and many others. It is commonly used over UDP and has a very limited size and format, but it has power at the same level as a button on a car dashboard. Technically it can be, and is being, used in many places as a trigger for some system. Indeed, UDP is not reliable at the level of TCP, but when the network equipment is trusted there is no reason not to use UDP.

A list of things that can be done using a DNS service messaging:

  • On/Off electrical switch
  • Identity signalling (AKA port knocking)
  • Banking transactions
  • Queue status updates
  • Alerts Signalling
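Each of those triggers fits in a single small UDP datagram. To make the "limited size and format" point concrete, here is a sketch that builds a raw DNS TXT query by hand, following the RFC 1035 wire format; the domain name is made up:

```python
import struct

def build_dns_query(name, qtype=16, qclass=1):
    """Build a minimal DNS query packet (qtype 16 = TXT, qclass 1 = IN).
    Header: fixed id, flags with RD set, one question, no other records."""
    header = struct.pack(">HHHHHH", 0x1234, 0x0100, 1, 0, 0, 0)
    # QNAME: each label is length-prefixed, terminated by a zero byte.
    qname = b"".join(bytes([len(label)]) + label.encode()
                     for label in name.split(".")) + b"\x00"
    question = qname + struct.pack(">HH", qtype, qclass)
    return header + question

packet = build_dns_query("signal.example.com")
print(len(packet), "bytes")  # small enough for a single UDP datagram
```

Sending that over UDP to a cooperating server is all the "messaging" such a trigger system needs.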

And many other uses which give an example of what an API can look like. I had the pleasure of reading a couple of books about APIs published by Nordic APIs, which gave me a fresh perspective on how others see an API and on what might happen on the wild Internet that requires attention.

One key point which I learned from them is mentioned in the video "Good APIs aren't built in a day".

And links to books from Nordic APIs which I had the pleasure to read:

eBook Released: Securing the API Stronghold

API Security: The 4 Defenses of The API Stronghold

  • "Works For Me" means that it was tested in a testing environment under real-world usage in forward proxy mode, with daily traffic such as browsing news, video, learning and games sites. Special applications that were tested are Skype, IRC and a couple of other applications inside a fully trusted network.
  • An advice: any system which sits against a non-trusted and hostile public or private network should be hardened, both at the squid configuration level and at lower levels.
  • This specific version (3.5.19) was also tested in intercept proxy mode; ssl-bump was tested only in forward-proxy mode and not in intercept mode.

A Proxy for each Internet user! The future!

What is a proxy as a tool? Is it a tool of war or a life-assisting tool?

The Internet is a reflection of the real world, and the world in general can at times be a war-zone, but it is more of a heaven. A proxy is basically an assisting tool for the warrior of the Internet. We can give it the shape of a Squid or of a Katana, but the tool itself is there to help. And despite the fact that in the science-fiction and fantasy worlds the image of such a tool might be singular, the truth is that it can take multiple forms. Also, not every Internet warrior needs the same tools as another. Some need raw Internet while others need a more digested one, based on age and experience.

Compared to the first human which God created, we engage the world at a much higher level than raw basic input and output. And since we are at about the 6,000th year since this world's creation, we have an embedded proxy in each and every one of us. Every pair of parents shares some set of tools with their kids. Yes, this tool of war helps us digest raw Internet input and output.
A wise man once told me "Your tongue has lots of power, do not do harm!" and I wondered to myself about this fact for a couple of years.
I knew that we have power in our words, but compared to the raw hardware we are at a much higher level. We all have a proxy embedded inside us, and this is a fact. Now the question which stands in front of every Internet user and admin is whether he wants to use this tool as an assisting lens over the lower levels of the Internet, building the next and higher level, or to harm what is already there.

Little by little in life we discover our proxy powers, and we can choose either to take them into our hands and do good, or to use these powers in a way that will shame our form as creations of a much better world. Yes, despite what many non-experienced kids say, we have a very good foundation, but we need to maintain it.

Squid 3.5.15+16 release article

I am happy to release the article for the new squid 3.5.15+16.
Until now the release was only of the RPMs themselves, but I hope that from now on it will be much simpler to release an article with every RPM.

The Fantasy Of Packaging
( A response to: Slamming Your Head Into Keyboard)

I was asked a couple of times "Why do you package?" and the simple answer is that I need it. But I must admit that it's not because other developers and packagers don't do a great job.
It's just that I have a couple of specific systems that I operate and I need them to be up to date, and until now I didn't have the option of someone who would take enough on his shoulders to make it happen at an Enterprise level.
And of course it's not that they do not want to, but any Enterprise-level distribution must give a good foundation for very complex systems. When they develop and package, they test and test and do many other things, until they can insure themselves and their clients.
It's as simple as this: if you have found an issue, they should help you with it according to the level of contract you have with them.
For example, since I have a licensed Windows PC and a licensed SLED laptop, I expect them to release at least security updates.

I know that every major Linux distribution does its job in a very good way. I know it because I can see their src.rpm packages and can compare my work with theirs.
Basically, if you are a young man or a kid, the scene from Toy Story in which Andy opens the present packages at the party and then finds "Buzz Lightyear" in a big box describes, to some degree, how handling a package shouldn't happen in Enterprise-level businesses.

(from about 09:00 to 14:00 minutes in the movie)

An admin should rely on packages and packagers in general, but he needs to plan and understand the nature of the update in order to anticipate what the update could do to the system.
If you have a big system which clients are using, you cannot just run "yum update xyz" because you want to update. And specifically with squid you need a strategy that will allow you to sort of "bypass" this specific server and use another server in the meanwhile.
On a small scale, it's simply adding another squid instance on the same machine, or accessing squid through a haproxy LB.
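A hypothetical minimal haproxy snippet for that small-scale setup, balancing two local squid instances so that one can be drained during an update (names, ports and addresses are examples, not a production configuration):

```
# haproxy.cfg sketch: two squid instances behind one listener.
# Taking squid_a down for "yum update" leaves squid_b serving clients.
frontend proxy_in
    mode tcp
    bind *:3128
    default_backend squids

backend squids
    mode tcp
    balance roundrobin
    server squid_a 127.0.0.1:3129 check
    server squid_b 127.0.0.1:3130 check
```

The health checks let haproxy stop sending traffic to an instance the moment it goes down for maintenance.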

It doesn't really require magic, just basic skill, knowledge and responsibility; if you are lacking one of these, you should invest some time to acquire these skills.

If for some reason the generic packaging doesn't fit your business or others', then you should understand that their aim was not exactly you but another audience. And yes… it's hard to see a package that does weird things like giving suid (chmod u+s XYZ) to a file which is actually just a tiny script that should never run as root. But these things happen when the packager is novice enough not to understand the meaning of the work.

RPMs Automation: Here I come!!!

The squid-cache project and many others do not package things in binary format, and it's important to understand that every open-source project probably has limited resources and goals; squid-cache is probably one of them.
All the sysadmins that use my RPMs will probably enjoy them for a very long time, but I have started to move from manual, hard-labor RPM creation to an automated one.
It means that the packages will be pushed to the repository automatically for every squid release. My plan is to "certify" each of the RPMs of a release with a set of tests which are mainly manual rather than automated.
I haven't decided yet what to do when a release fixes one bug but introduces another, but I will probably bump the RPMs version after basic testing.

And I would like to dedicate this amazing remix to all the amazing young IT-industry mages that work many nights to make the Magical Internet as great as it is now.

—www.youtube.com/watch?v=YFwoigs7Lhk

All The Bests,
Eliezer Croitoru

Squid 3.5.12 RPMs release

I am happy to release the new RPMs of squid 3.5.12 for CentOS 6 (64-bit and 32-bit) and CentOS 7 (64-bit).

The new release includes a couple of bug fixes and improvements.
I have also taken the time to build the latest beta 4.0.3 RPM for CentOS 7.
The details about the RPMs repository are at squid-wiki.

Why 3DES (Triple DES)? Or: the fall of DES.

It is known in the cryptography world that since 1997 DES (i.e. single DES) has been vulnerable to some attacks, and it is therefore considered unsafe for some uses. To resolve the DES issues, 3DES was adopted, since it could reuse the same fast cryptography machinery/chips that were used before, thereby giving the industry some time to find another, better-fitting solution.
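To put the fall of DES in numbers: single DES has a 56-bit key, while 3DES is usually credited with about 112 bits of effective strength (meet-in-the-middle attacks reduce its raw 168 key bits). A back-of-the-envelope sketch, where the brute-force rate is purely an assumption:

```python
# Back-of-the-envelope keyspace comparison; the rate is an assumption.
des_keys = 2 ** 56
tdes_effective_bits = 112  # effective strength of 3DES, not its raw 168 bits

keys_per_second = 10 ** 9  # assume a billion keys per second
des_years = des_keys / keys_per_second / (3600 * 24 * 365)
print(f"DES keyspace exhausted in about {des_years:.1f} years at 1e9 keys/s")
print(f"3DES is 2**{tdes_effective_bits - 56} times harder to brute-force")
```

At that assumed rate the whole DES keyspace falls in a couple of years on one machine, which is why dedicated hardware broke it in days already in the late 1990s.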
Some words about the DES encryption from Professor Gideon Samid:

Hashing compared to Encryption

The difference between hashing and encryption is the ability to recreate the original content from the digest. Hashes are meant to allow some kind of content validation/verification, based on the low probability of mathematical collisions. To give a simple example on the subject, we can use the quadratic formula:
x = (-b ± √(b² - 4ac)) / 2a
The formula shows that it is possible to have two answers to the same question/equation/variables.
Based on the fact that two solutions can satisfy the same unknowns+function, a single function can be used to describe more than one number. And in the world of computers, where everything is some kind of number, we can convert the unknown numbers to octets.
Once there is no difference between numbers and/or octets and letters, we are in the world of function computation. There we can use all sorts of functions/equations to describe all sorts of numbers and, by extension, letters.
Eventually, hashes are some kind of known functions which reflect very big numbers, or very big documents, in a fixed-size output. Technically speaking, it's a function/method that reflects very big numbers in a small output (128 bits, for example), with some probability (high or low) that multiple input values will be reflected by the same output number.
At many application levels, hashes such as CRC32/MD5/SHA-1/others are being used, and these applications allow themselves to validate content integrity with a fully "vulnerable" hash, because the validated content does not exceed the sizes at which the function's collisions matter.
I must admit that I have used MD5 and many other hashes for a very long time, and the only collisions I have seen affect real-world application integrity are those of CRC32 hashes; maybe I have not seen enough yet!
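CRC32 collisions are indeed easy to run into: with only 32 bits of output, the birthday paradox expects a collision among random inputs after roughly 2^16 (about 65,000) samples. A small Python sketch that finds one by brute force:

```python
import random
import zlib

def find_crc32_collision(seed=42):
    """Find two different byte strings with the same CRC32 by random search.
    Expected to succeed after ~2**16 samples (birthday bound for 32 bits)."""
    rng = random.Random(seed)
    seen = {}
    while True:
        data = rng.getrandbits(64).to_bytes(8, "big")
        crc = zlib.crc32(data)
        if crc in seen and seen[crc] != data:
            return seen[crc], data
        seen[crc] = data

a, b = find_crc32_collision()
print(a != b, zlib.crc32(a) == zlib.crc32(b))  # different inputs, same CRC32
```

The same brute-force search against SHA256 would be hopeless, which is why the output size of the hash has to match the size of the content it protects.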
And a couple of expert words from Professor Gideon Samid on hashing:

  • Disclaimer: I am not a cryptography expert!

This RPMs release was tested for:

  • ICAP 204/206 compatibility (non-SSL)
  • eCAP passthru adapter which digests the response body using SHA256
  • refresh_pattern variations
  • StoreID patterns
  • Basic load testing
  • Basic ssl-bump usage in strict forward proxy mode
  • Basic verification of no memory leaks over a long period of operation
  • Basic build tests

All of the above was done on CentOS 7 x86_64 VMs.
I have not tested everything on CentOS 6, since it is assumed that if it works well on CentOS 7 there should be no special reason for it not to work on CentOS 6.

More details about the repository at squid-wiki.

All The Bests,
Eliezer Croitoru