What Python’s “magic methods” are and how you would use them to make a simple account class more Pythonic.
In Python, special methods are a set of predefined methods you can use to enrich your classes. They are easy to recognize because they start and end with double underscores, for example `__init__` or `__str__`.
These “dunders” or “special methods” in Python are also sometimes called “magic methods.”
- Dunder methods let you emulate the behavior of built-in types. For example, to get the length of a string you can call len('string').
class NoLenSupport:
    pass

>>> obj = NoLenSupport()
>>> len(obj)
TypeError: "object of type 'NoLenSupport' has no len()"
To fix this, you can add a `__len__` dunder method to your class:
class LenSupport:
    def __len__(self):
        return 42

>>> obj = LenSupport()
>>> len(obj)
42
You can see Python’s data model as a powerful API you can interface with by implementing one or more dunder methods.
Throughout this article I will enrich a simple Python class with various dunder methods to unlock the following language features:
- Initialization of new objects
- Object representation
- Iteration support
- Operator overloading (comparison)
- Operator overloading (addition)
- Method invocation
- Context manager support (with statement)
Right upon starting my class I already need a special method. To construct account objects from the Account class I need a constructor, which in Python is the `__init__` dunder:
class Account:
    """A simple account class"""

    def __init__(self, owner, amount=0):
        """
        This is the constructor that lets us create
        objects from this class
        """
        self.owner = owner
        self.amount = amount
        self._transactions = []
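With `__init__` in place, I can construct account objects. A quick interactive sketch (the owner name and starting amount are just example values):

>>> acc = Account('bob', 10)
>>> acc.owner
'bob'
>>> acc.amount
10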
It’s common practice in Python to provide a string representation of your object for the consumer of your class (a bit like API documentation). There are two ways to do this using dunder methods:

- `__repr__`: The “official” string representation of an object. This is how you would make an object of the class; the goal of `__repr__` is to be unambiguous.
- `__str__`: The “informal” or nicely printable string representation of an object. This is for the end user.
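Here is a minimal sketch of both methods for the Account class (the exact wording of the returned strings is just an illustration):

class Account:
    # ... (see above)

    def __repr__(self):
        # Unambiguous: shows how to recreate the object
        return 'Account({!r}, {!r})'.format(self.owner, self.amount)

    def __str__(self):
        # Readable: meant for the end user
        return 'Account of {} with starting amount: {}'.format(
            self.owner, self.amount)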
In order to iterate over our account object, I first need a way to add some transactions and to query the balance; a sketch of the required methods follows below.
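One possible sketch, assuming the `_transactions` list from `__init__` is where transactions are stored (the int-only check is just a simplification):

class Account:
    # ... (see above)

    def add_transaction(self, amount):
        # Record a deposit (positive) or withdrawal (negative)
        if not isinstance(amount, int):
            raise ValueError('please use int for amount')
        self._transactions.append(amount)

    @property
    def balance(self):
        # Starting amount plus the sum of all transactions
        return self.amount + sum(self._transactions)

    def __len__(self):
        return len(self._transactions)

    def __getitem__(self, position):
        # Supporting __getitem__ is enough to make the object iterable
        return self._transactions[position]

With `__getitem__` in place, `for transaction in acc:` and `list(acc)` both work.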
Comparison operators such as ==, < and > work across built-in types like strings and integers. This polymorphic behavior is possible because these objects implement one or more comparison dunder methods. An easy way to verify this is to use the dir() builtin:
>>> dir('a')
['__add__',
...
'__eq__', <---------------
'__format__',
'__ge__', <---------------
'__getattribute__',
'__getitem__',
'__getnewargs__',
'__gt__', <---------------
...]
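To get the same comparison support for accounts, one option is to implement `__eq__` and `__lt__` and let `functools.total_ordering` derive the remaining operators. A sketch that compares accounts by balance (it assumes the `balance` property from the iteration sketch above):

from functools import total_ordering

@total_ordering
class Account:
    # ... (see above)

    def __eq__(self, other):
        return self.balance == other.balance

    def __lt__(self, other):
        return self.balance < other.balance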
In Python, everything is an object. We are completely fine adding two integers or two strings with the + (plus) operator; it behaves in expected ways:
>>> dir(1)
[...
'__add__',
...
'__radd__',
...]
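Accounts can support the + operator as well by implementing `__add__`. A minimal sketch that merges two accounts into a new one (it assumes the `add_transaction` method and iteration support from the earlier sketches):

class Account:
    # ... (see above)

    def __add__(self, other):
        # Combine owners and starting amounts, then replay all transactions
        owner = '{}&{}'.format(self.owner, other.owner)
        start_amount = self.amount + other.amount
        acc = Account(owner, start_amount)
        for transaction in list(self) + list(other):
            acc.add_transaction(transaction)
        return acc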
You can make an object callable like a regular function by adding the `__call__` dunder method.
class Account:
    # ... (see above)

    def __call__(self):
        print('Start amount: {}'.format(self.amount))
        print('Transactions: ')
        for transaction in self:
            print(transaction)
        print('\nBalance: {}'.format(self.balance))
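Calling the instance then prints a small report. An interactive sketch, assuming the transaction support from the earlier snippets and some example values:

>>> acc = Account('bob', 10)
>>> acc.add_transaction(20)
>>> acc.add_transaction(-10)
>>> acc()
Start amount: 10
Transactions:
20
-10

Balance: 20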
My final example in this tutorial is about a slightly more advanced concept in Python: Context managers and adding support for the with statement.
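Context manager support comes from two more dunders, `__enter__` and `__exit__`. Here is one possible sketch that backs up the transactions on entry and rolls them back if an exception is raised inside the with block (the printed messages are just illustrative):

class Account:
    # ... (see above)

    def __enter__(self):
        print('ENTER WITH: making a backup of transactions for rollback')
        self._copy_transactions = list(self._transactions)
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        print('EXIT WITH:', end=' ')
        if exc_type:
            # An exception occurred: restore the backed-up transactions
            self._transactions = self._copy_transactions
            print('rolling back to previous transactions')
        else:
            print('transaction OK')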
Similar to the previous post, this article assumes no prior knowledge of statistics, but does require at least a general knowledge of Python and general data science workflows.
- Prerequisites: If you are uncomfortable with for loops and lists, I recommend covering them briefly in our free introductory Python course before progressing.
What is probability? An event is some outcome of interest. The quintessential representation of probability is the humble coin toss. In a coin toss the only events that can happen are:

- Flipping a heads
- Flipping a tails

These two events form the sample space, the set of all possible events that can happen. To calculate the probability of an event occurring, we count how many times the event of interest can occur and divide it by the size of the sample space; for a fair coin, the probability of heads is 1 out of 2, or 1/2.
Our data will be generated by flipping a coin 10 times and counting how many times we get heads.
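A minimal sketch of that data-generating process using Python's random module (the number of repeated trials is an arbitrary choice for illustration):

import random

def coin_trial():
    """Flip a fair coin 10 times and return the number of heads."""
    heads = 0
    for _ in range(10):
        if random.random() <= 0.5:
            heads += 1
    return heads

def simulate(n_trials):
    """Repeat the 10-flip trial n_trials times."""
    return [coin_trial() for _ in range(n_trials)]

>>> simulate(5)   # results vary, e.g. [4, 6, 5, 7, 3]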
The normal distribution refers to a particularly important phenomenon in the realm of probability and statistics.
The normal distribution is significant to probability and statistics thanks to two factors:
- the Central Limit Theorem: if we make many estimates, the Central Limit Theorem dictates that the distribution of these estimates will look like a normal distribution.
- the Three Sigma Rule: the Three Sigma Rule, also known as the empirical rule or 68-95-99.7 rule, is an expression of how many of our observations fall within a certain distance of the mean. It dictates that, given a normal distribution, 68% of observations fall within one standard deviation of the mean, 95% within two, and 99.7% within three.

A related tool is the Z-score. The Z-score is a simple calculation that answers the question, “Given a data point, how many standard deviations is it away from the mean?” The equation below is the Z-score equation:

z = (x − μ) / σ

where x is the data point, μ is the mean of the data, and σ is its standard deviation.
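As a quick illustration, here is a sketch of the Z-score calculation in plain Python (the sample data is made up for the example):

def z_scores(data):
    """Return the Z-score of every point in data."""
    mean = sum(data) / len(data)
    # Population standard deviation
    std = (sum((x - mean) ** 2 for x in data) / len(data)) ** 0.5
    return [(x - mean) / std for x in data]

>>> z_scores([2, 4, 4, 4, 5, 5, 7, 9])
[-1.5, -0.5, -0.5, -0.5, 0.0, 0.0, 1.0, 2.0]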