my net house
WAHEGURU….!
Quantum Computer Init__
February 19, 2021
Grover’s Algorithm: an algorithm that runs on a quantum computer to find specific information in unstructured data.
DWave: D-Wave is using a 533-qubit computer to solve challenges in AI and voice + image recognition!
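To get a feel for why Grover’s algorithm finds a marked item in roughly √N steps, here is a small classical simulation of its amplitude amplification (an illustrative sketch in plain NumPy, not an actual quantum implementation; the search space size and marked index are made up):

```python
import numpy as np

N = 8                      # search space of 8 items (3 qubits)
marked = 5                 # index of the item we are searching for

# Start in the uniform superposition: every amplitude is 1/sqrt(N).
amps = np.full(N, 1 / np.sqrt(N))

# The optimal number of Grover iterations is about (pi/4) * sqrt(N).
iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))

for _ in range(iterations):
    amps[marked] *= -1                 # oracle: flip the marked amplitude's sign
    amps = 2 * amps.mean() - amps      # diffusion: inversion about the mean

print(iterations)                      # 2
print(round(amps[marked] ** 2, 3))     # 0.945 -- probability of measuring the marked item
```

After only two iterations the marked item is measured with ~94.5% probability, versus 1/8 for classical random guessing.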
Steps to Solve a Rubik’s Cube in 2 Minutes! (Practice Matters!)
December 16, 2020
Make a butterfly first: a yellow piece in the center with white pieces adjacent to the yellow at 90 degrees.
Match the corner pieces adjacent to the white pieces with the center of the other color and make two rotations.
White cross: after completing the second step you will have a white cross.
Keep the white cross on the bottom; the white cross will remain entirely on the bottom.
Time to perform left and right triggers:
Solve the entire white face: to solve the entire white face you need to match the white pieces on the uppermost layer with the center of the same color and perform the left or right trigger, whichever is required.
After completing all the left and right triggers you will have the complete white face solved.
Non-yellow pieces on the top: find each non-yellow piece on the top layer and match the corner pieces with the center; after that, perform the left and right triggers again.
But this time you have to do the left/right trigger twice in a row:
After completing this step the bottom layer and two side layers will be solved.
Get a yellow cross on the top:
To get the yellow cross on the top you need to perform a few algorithms:
R = rotate the right side clockwise.
R’ = rotate the right side anticlockwise.
R2 = rotate the right side twice clockwise.
So the first notation is as follows:
FURU’R’F’
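As a side note, this move notation can be tokenized programmatically (a small illustrative sketch, not part of the solving method itself):

```python
import re

def parse_moves(algorithm):
    """Split a cube algorithm string like "FURU'R'F'" into
    individual moves: a face letter optionally followed by 2 or '."""
    return re.findall(r"[FBRLUD][2']?", algorithm)

print(parse_moves("FURU'R'F'"))  # ['F', 'U', 'R', "U'", "R'", "F'"]
```

This makes it easy to count moves or feed an algorithm into a cube simulator one move at a time.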
If you have two yellow edge pieces which make a single yellow line, turn the yellow line towards you and perform FURU’R’F’ again.
If you have two yellow edge pieces which form a backward L, turn the backward L away from you and perform FURU’R’F’ again.
After performing the above steps you will have a yellow cross on the top; there may be other yellow pieces as well.
3. Now Solve the Entire Yellow Side
At this point you have to find the yellow color on the topmost layer, bring it to the left corner, and perform the following algorithm:
RUR’URU2R’
Perform the following again if you have yellow corners on the upper side layer:
RUR’URU2R’
After this you might have a FISH shape on the top yellow side; rotate the fish down towards you to the left and perform the algorithm again:
RUR’URU2R’
4. Position the Corners of the Cube (at 19:31 in the video link).
On the top side layer you have to match the side pieces, using the following algorithm:
L’URU’LUR’RUR’URU2R’
You might find that none of your side pieces have the same color:
L’URU’LUR’RUR’URU2R’
5. One of the faces matches the corner pieces:
Hold the face matching the corner piece in your left hand and perform the following algorithm:
LURU’LUR’RUR’URU2R’
6. At this point all of your corner pieces should match:
If your corner pieces don’t match, you have to perform Step 5’s algorithm one more time.
7. Final Step: Position the Edges:
If you have one side solved by now, keep it away from you.
F2UR’LF2L’RUF2
After that you need to swap the edge pieces counterclockwise.
F2U’R’LF2L’RU’F2
If still none of your sides are solved, perform the counterclockwise algorithm:
F2U’R’LF2L’RU’F2
You might have one solved face; keep it away from you and perform the counterclockwise algorithm one more time:
F2U’R’LF2L’RU’F2
That’s it. Your cube is solved!
Heap and Heap Sort [draft]
December 14, 2020
Heap sort is a bit different from other types of sorts.
Things we need in heap sort:
Priority queue: implements a set S of elements, where each element is associated with a key.
Insert(x): insert element x into set S.
Max(S): return the element of S with the largest key.
ExtractMax(S): return the element with the max key and remove it from S.
IncreaseKey(x, k): increase the value of x’s key to a new value k.
Heap as a tree: the root of the tree is the first element (i = 1).
parent(i) = i/2
left(i) = 2i, right(i) = 2i + 1
Max-heap: the key of a node >= the keys of its children!
Big question: how do we maintain the max-heap property?
Another big question: how are we going to build the max-heap?
Heap operations:
build_max_heap(): produces a max-heap from an unordered array.
max_heapify(): corrects a single violation of the heap property by checking the condition against the children.
Convert array A[1…n] into a max-heap.
The code (pseudocode) is:
def build_max_heap(A):
    for i = n/2 down to 1:
        max_heapify(A, i)
Present complexity: O(n log n).
How do we get a better complexity, like O(n)?
We need to understand convergent series!
Conclusion, in five steps of heap sorting:
Build a max-heap from the unordered array.
The max element is now A[1].
Swap element A[1] with A[n]; now the max element is at the end of the array.
Discard element n from the heap by decrementing the heap size to n−1.
The new root may violate the max-heap property, but its children are valid max-heaps.
Run max_heapify() again, which means looping steps 2 to 5 until the heap size is 1!
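The five steps above can be sketched as runnable Python (a minimal sketch using 0-based indexing, so parent/children become (i-1)//2, 2i+1, 2i+2 rather than the 1-based i/2, 2i, 2i+1 in the notes):

```python
def max_heapify(A, i, heap_size):
    # Push A[i] down until the max-heap property holds for the subtree at i.
    left, right = 2 * i + 1, 2 * i + 2
    largest = i
    if left < heap_size and A[left] > A[largest]:
        largest = left
    if right < heap_size and A[right] > A[largest]:
        largest = right
    if largest != i:
        A[i], A[largest] = A[largest], A[i]
        max_heapify(A, largest, heap_size)

def build_max_heap(A):
    # Leaves are already heaps; heapify the internal nodes bottom-up.
    for i in range(len(A) // 2 - 1, -1, -1):
        max_heapify(A, i, len(A))

def heap_sort(A):
    build_max_heap(A)
    for end in range(len(A) - 1, 0, -1):
        A[0], A[end] = A[end], A[0]   # move the current max to the end
        max_heapify(A, 0, end)        # restore the heap on the remaining prefix
    return A

print(heap_sort([4, 1, 3, 2, 16, 9, 10]))  # [1, 2, 3, 4, 9, 10, 16]
```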
Thenkkewwww
IBM Quantum Course Application
December 12, 2020
My passion for advanced technologies has grown tremendously since the day my father brought a Pentium 3 Intel-based computer home; I was 14 years old at that time. My curiosity led me to complete my graduation in Information Technology, and I have been working in the field of software development since 2013. I took multiple courses during my graduation, such as Advanced Data Structures, Machine Learning, Discrete Mathematics, Microprocessor and Assembly Language Programming, Digital Circuits, and Logic Design.
My Github Profile Link: https://github.com/arshpreetsingh
Over the past years, I have worked in multiple roles in the software industry and taught myself various programming languages such as Python, Julia, C, C++, Scala, and Go. I have industry-wide experience in software development, database management, system architecture, data operations, and automated testing. I always concentrate on using open-source technologies to cut down costs, and on making great use of automation to decrease costs across different areas.
I have contributed to open-source projects as well, such as TuxMath, TuxBlocks, Plyer, FreeCAD, and OpenStreetMap.
Some of my featured articles on Python, Julia, and quantitative finance have been published at medium.com as well.
[Python3 for Text Processing]
https://medium.com/@arshpreetsingh/pythonfortextprocessinge8fa81802a71
[Guide to Julia Programming]
https://medium.com/@arshpreetsingh/julia15da0fb911d6
[Hacker’s Guide to Quantitative Finance]
https://medium.com/@arshpreetsingh/hackersguidetoquantitativetradingalgorithmictradingpart1ea479ab2e790
I have been working in the finance industry since 2017, using machine learning, statistics, and applied mathematics to build financial models.
The finance industry already uses physics and applied mathematics; a few examples are Bollinger Bands, the Black-Scholes-Merton model, binomial distributions, and Bayesian statistics. Quantum computing will be more efficient than classical computers at finding patterns in data, which will lead to the development of better algorithms.
The detection of fraudulent activities using pattern recognition will be faster using quantum computers in the financial world. Data will be more secure once encapsulated using quantum-cryptography techniques: one cannot read data encoded in quantum states because they shape-shift by changing states and as such prevent eavesdropping. Using quantum computers for ML and AI models could lead to more optimized models and improve model accuracy. I have explored the applications of quantum technology in portfolio optimisation, credit risk analysis, and time-series analysis using the Qiskit tutorials from GitHub via this link (https://github.com/Qiskit/qiskittutorials/tree/master/tutorials/finance). Apart from that, I have also completed Qiskit Foundations on YouTube, taught by Abraham Asfaw.
Using quantum computing technology, I want to develop useful algorithms for arbitrage and portfolio rebalancing, which I consider the first goal in my journey of learning quantum computers.
Looking forward to diving into the Quantum Realm and coming out with new forms of life.
Quant Topics to Review and Work Through
December 9, 2020
▪ Introduction to Algorithmic Trading [Done]
▪ Trend Following Strategies [Not Done]
▪ Momentum Based Strategies [Not Done]
▪ Strategy Development [Done]
▪ Backtesting [Done]
▪ Performance Measurement [Not Done]
▪ Parameter Optimization [Not Done]
▪ Money Management [Not Done]
▪ Risk Management [Not Done]
▪ Algorithm Trading Infrastructure Setup [Done]
▪ Algorithmic System Design and Implementation [Done]
▪ API Integration – Integrating the Modules with OMNESYS NEST PLATFORM [Done]
▪ Machine Learning for Quantitative Trading Using Python [Done]
▪ Time Series Analysis Using Python [Done]
▪ Optimization Methods [Not Done]
▪ Introduction to Quantitative Trading [Not Done]
▪ Options Pricing [Not Done]
▪ Options Greeks [Not Done]
▪ Arbitrage Strategies [Not Done]
▪ Options Trading Strategies [Not Done]
▪ Volatility Trading Strategies [Not Done]
▪ Statistical Arbitrage Strategies [Not Done]
▪ Electronic Market Making [Not Done]
▪ Execution Algorithms [Not Done]
Graph Algorithms in Python, Part 1
November 25, 2020
Things required in the graph base class:
Number of vertices.
Graph type (directed or undirected).
Method to add edges.
Method to find adjacent vertices.
Method to get the indegree.
Method to get the edge weight.
Method to display the graph.
https://github.com/arshpreetsingh/GraphAlgos/blob/master/graph.py#L11
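The checklist above can be sketched as an abstract base class (a minimal sketch; the names and signatures here are illustrative, and the linked repository's actual class may differ in details):

```python
import abc

class Graph(abc.ABC):
    """Abstract base for graph representations."""

    def __init__(self, numVertices, directed=False):
        self.numVertices = numVertices       # number of vertices
        self.directed = directed             # graph type: directed or undirected

    @abc.abstractmethod
    def add_edge(self, v1, v2, weight=1):
        """Add an edge between v1 and v2 with an optional weight."""

    @abc.abstractmethod
    def get_adjacent_vertices(self, v):
        """Return the vertices adjacent to v."""

    @abc.abstractmethod
    def get_indegree(self, v):
        """Return the number of edges pointing into v."""

    @abc.abstractmethod
    def get_edge_weight(self, v1, v2):
        """Return the weight of the edge between v1 and v2."""

    @abc.abstractmethod
    def display(self):
        """Print the graph's edges."""
```

Concrete representations (adjacency set, adjacency matrix) then subclass this and fill in the abstract methods.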
Adjacency set: use the following to learn about the adjacency-set representation.
Node:
A single node in a graph represented by an adjacency set. Every node has a vertex id, and each node is associated with a set of adjacent vertices.
Link for Node Code here: https://github.com/arshpreetsingh/GraphAlgos/blob/master/graph.py#L45
AdjacencyGraph:
Represents a graph as an adjacency set. A graph is a list of Nodes, and each Node has a set of adjacent vertices.
The graph in this current form cannot represent weighted edges; only unweighted edges can be represented.
It inherits from the Graph class (number of vertices, graph type).
1. Add the Node objects in the __init__() method of the class.
2. Method to add vertices/edges, with some checks (a vertex must not be negative and its value must be less than the number of vertices):
if v1 >= self.numVertices or v2 >= self.numVertices or v1 < 0 or v2 < 0:
    raise ValueError("Vertices %d and %d are out of bounds" % (v1, v2))
3. Get adjacent vertices.
For a specific location in the vertex list, get the adjacent values:
self.vertex_list[v].get_adjacent_vertices()
4. Get the indegree of a vertex.
Find all the vertices which are connected to a specific vertex:
indegree = 0
for i in range(self.numVertices):
    if v in self.get_adjacent_vertices(i):
        indegree = indegree + 1
return indegree
5. Display the graph:
Iterate through the list of vertices (self.vertex_list), then iterate through each node in the list to get its adjacent vertices!
def display(self):
    for i in range(self.numVertices):
        for v in self.get_adjacent_vertices(i):
            print(i, "->", v)
https://github.com/arshpreetsingh/GraphAlgos/blob/master/graph.py#L68
AdjacencyMatrixGraph:
Represents a graph as an adjacency matrix. A cell in the matrix has a value when there exists an edge between the vertex represented by the row number and the one represented by the column number.
Weighted graphs can hold values > 1 in the matrix cells.
A value of 0 in a cell indicates that there is no edge.
The rest of the methods will be the same as in AdjacencyGraph; all operations will be replaced by matrix operations instead of set() operations.
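The matrix representation described above can be sketched as follows (a minimal self-contained sketch; the repository's version may differ in names and inherits from the base class, which is omitted here):

```python
class AdjacencyMatrixGraph:
    def __init__(self, numVertices, directed=False):
        self.numVertices = numVertices
        self.directed = directed
        # matrix[i][j] holds the edge weight; 0 means "no edge"
        self.matrix = [[0] * numVertices for _ in range(numVertices)]

    def add_edge(self, v1, v2, weight=1):
        if v1 >= self.numVertices or v2 >= self.numVertices or v1 < 0 or v2 < 0:
            raise ValueError("Vertices %d and %d are out of bounds" % (v1, v2))
        self.matrix[v1][v2] = weight
        if not self.directed:
            self.matrix[v2][v1] = weight   # mirror the edge for undirected graphs

    def get_adjacent_vertices(self, v):
        # Non-zero cells in row v are the neighbours of v.
        return [i for i in range(self.numVertices) if self.matrix[v][i] > 0]

    def get_indegree(self, v):
        # Non-zero cells in column v are edges pointing into v.
        return sum(1 for i in range(self.numVertices) if self.matrix[i][v] > 0)

    def get_edge_weight(self, v1, v2):
        return self.matrix[v1][v2]

g = AdjacencyMatrixGraph(3)
g.add_edge(0, 1)
g.add_edge(0, 2, weight=5)
print(g.get_adjacent_vertices(0))  # [1, 2]
print(g.get_indegree(2))           # 1
```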
Some Courses to Look into
October 30, 2020
Mathematics for Machine Learning.
Quantum Physics Full Course.
Linear Algebra Full Course.
Computational Geometry Course!
Vanilla Implementation of Financial Risk Modeling in Python!
August 28, 2020
For most of .
Use historical stock data to calculate the historical portfolio variance.
Use a factor-analysis model to calculate the portfolio variance.
Set up scenarios using the factor-analysis model (stress testing!).
Calculate the worst-case scenario from the historical data and the scenarios.
Compare the VaR (Value at Risk) for these cases.
What is the six-step approach for a scenario-based risk model?
Create a basket of financial assets (each asset has uncertain “returns”).
Calculate the standard deviation of that basket of financial assets.
Find the systematic and idiosyncratic risk of each asset in the portfolio.
Study those risk factors and also compare them with historical data.
Generate scenarios; find out how the assets may perform in the future.
Calculate the worst-case outcomes.
Scrape data from Yahoo Finance for multiple stock symbols.
The asset symbols are as follows:
["AAPL","ADBE","CVX","GOOG","IBM","MDLZ","MSFT","NFLX","ORCL","SBUX","^GSPC","VOOG"]
Here is the code to download each stock’s data and save it to an individual CSV file:
def download_csv_data(symbol_names=[], dir_path=''):
    for symbol in symbol_names:
        data_url = ("https://query1.finance.yahoo.com/v7/finance/download/{symbol}"
                    "?period1=1283040000&period2=1598659200&interval=1d&events=history").format(symbol=symbol)
        data = requests.get(data_url)
        with open(dir_path + symbol + ".csv", 'w') as f:
            for line in data.iter_lines():
                f.write(line.decode('utf8') + '\n')
    return
Now combine all the CSV files into one DataFrame.
def combine_data(dir_path=""):
    data_frames = []
    files = os.listdir(dir_path)
    for file in files:
        data_frame = pd.read_csv(file)
        data_frame = pd.DataFrame(data_frame["Adj Close"].values,
                                  columns=[file.split(".")[0]])
        data_frames.append(data_frame)
    result = pd.concat(data_frames, axis=1, sort=False)
    return result
Now we have a DataFrame which looks something like this.
The portfolio-assembly part is now done.
Let’s calculate the historical risk. Study the following two lists carefully: we have the stock names for which we want to calculate the historical risk, and the factor names on whose factors we want to base that calculation.
stockNames = ["AAPL","ADBE","CVX","GOOG","IBM","MDLZ","MSFT","NFLX","ORCL","SBUX"]
factorNames = ["^GSPC","VOOG","Intercept"]
Let’s calculate the stock returns and factor returns. Why?
Because that’s all we need in order to calculate the historical risk of the financial assets.
Look at the following mathematical expression to calculate the historical risk.
Now the calculations!
stockReturns = returns[stockNames]
factorReturns = returns[factorNames]
weights = np.array([1.0/len(stockNames)] * len(stockNames))
historicalTotalRisk = np.dot(np.dot(weights, stockReturns.cov()), weights.T)
Now that we have calculated the historical variance of our portfolio, the next step is to perform factor analysis: systematic and idiosyncratic.
Factor-based model:
Decompose our model into systematic and idiosyncratic risks, and use this understanding for stress-testing scenarios.
That will be our scenario-based model.
Systematic risk:
Idiosyncratic risk:
Now we have chosen our risk factors:
1. S&P 500 – spread across the market -> systematic risk.
2. VFISX – could affect individual stocks -> idiosyncratic risk.
The assumption here is that if interest rates go up, stock investments will go down,
so the S&P 500 and VFISX are somewhat opposite to each other in terms of risk factors.
TotalVar(p) = SystematicVar(p) + IdiosyncraticVar(p)
Risk-factor analysis: now let’s express the returns on every stock w.r.t. the factor returns in terms of a regression equation.
This regression equation will tell us how much a change in the risk factor will affect the returns of each stock.
Residual = observed value – predicted value
e = y – ŷ
Alpha = stock-specific outperformance
Beta on factor F1
Beta on factor F2
import statsmodels.api as sm

xData = factorReturns
modelCoeffs = []
for oneStockName in stockNames:
    yData = stockReturns[oneStockName]
    model = sm.OLS(yData, xData)
    result = model.fit()
    modelCoeffRow = list(result.params)
    modelCoeffRow.append(np.std(result.resid, ddof=1))
    modelCoeffs.append(modelCoeffRow)
    print(result.summary())
We have calculated the residual coefficients for each stock individually; using those, let’s calculate the systematic and idiosyncratic risk of each stock.
SystematicRisk = Weights × FactorBetaMatrix × FactorCovarianceMatrix × Transpose(FactorBetaMatrix) × Transpose(Weights)
idiosyncraticRisk = sum(modelCoeffs["ResidVol"] * modelCoeffs["ResidVol"] * weights * weights)
factorModelTotalRisk = systemicRisk + idiosyncraticRisk
Systematic and idiosyncratic variance calculations:
factorCov = factorReturns[["VOOG","^GSPC"]].cov()
reconstructedCov = np.dot(np.dot(modelCoeffs[["B_FVX","B_SP"]], factorCov),
                          modelCoeffs[["B_FVX","B_SP"]].T)
systemicRisk = np.dot(np.dot(weights, reconstructedCov), weights.T)
idiosyncraticRisk = sum(modelCoeffs["ResidVol"] * modelCoeffs["ResidVol"] * weights * weights)
factorModelTotalRisk = systemicRisk + idiosyncraticRisk
We have calculated the risk factors for each model as well. Let’s move on to generating scenarios for each stock.
Scenario-based model:
Let’s consider two scenarios:
S&P 500 is at its lowest point (systematic scenario).
FVX is at its lowest point (idiosyncratic scenario).
We are assuming a 5% step change for the S&P 500 and 2% for FVX each day.
Let’s use Python to generate the FVX scenarios and S&P scenarios, which indicate how high and how low the S&P 500 and FVX could go in the future.
fvxScenarios = np.arange(min(returns["FVX"]), max(returns["FVX"]), 0.05)
spScenarios = np.arange(min(returns["^GSPC"]), max(returns["^GSPC"]), 0.02)
Let’s test the scenarios with each individual stock.
scenarios = []
for oneFVXValue in fvxScenarios:
    for oneSPValue in spScenarios:
        oneScenario = [oneFVXValue, oneSPValue]
        for oneStockName in stockNames:
            alpha = float(modelCoeffs[modelCoeffs["Names"] == oneStockName]["Alpha"])
            beta_sp = float(modelCoeffs[modelCoeffs["Names"] == oneStockName]["B_SP"])
            beta_fvx = float(modelCoeffs[modelCoeffs["Names"] == oneStockName]["B_FVX"])
            oneStockPredictedReturn = alpha + beta_sp * oneSPValue + beta_fvx * oneFVXValue
            oneScenario.append(oneStockPredictedReturn)
        scenarios.append(oneScenario)
We have obtained the historical risk and the factor-model risk; this time we will calculate the scenario-based risk.
scenarios = pd.DataFrame(scenarios)
scenarios.columns = ["FVX","SP500","AAPL","ADBE","CVX","GOOG","IBM","MDLZ","MSFT","NFLX","ORCL","SBUX"]
scenariosCov = scenarios[stockNames].cov()
scenarioTotalRisk = np.dot(np.dot(weights, scenariosCov), weights.T)
So what have we done?
1. Calculated the historical relationship.
2. Calculated risk using factor-based models.
3. Used the factor-based models to generate scenario-based models.
Calculating VaR:
P = the amount invested in the stocks.
sigma = the variance of returns (from the historical / risk-factor / scenario-stress-test-based model).
Z = the percentage of loss we can bear, expressed as the number of standard deviations away from the mean.
Calculation of VaR using Python:
from scipy.stats import norm
import math

confLevel = 0.95
principal = 1
numMonths = 1

def calculateVaR(risk, confLevel, principal=1, numMonths=1):
    vol = math.sqrt(risk)
    return abs(principal * norm.ppf(1 - confLevel, 0, 1) * vol * math.sqrt(numMonths))

print(calculateVaR(scenarioTotalRisk, 0.99))
print(calculateVaR(historicalTotalRisk, 0.99))
print(calculateVaR(factorModelTotalRisk, 0.99))
The thing that matters most is the estimation of volatility, which is sigma.
Multi-period VaR = VaR × sqrt(number of trading periods)
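As a quick numeric sketch of this square-root-of-time rule (the variance, confidence level, and number of periods below are illustrative, not taken from the portfolio above):

```python
import math
from scipy.stats import norm

# Hypothetical single-period (e.g. monthly) variance of portfolio returns.
risk = 0.0025          # variance; volatility = sqrt(0.0025) = 5%
confLevel = 0.99
principal = 1.0

# Single-period VaR at 99% confidence: |z| * volatility * principal.
singlePeriodVaR = abs(principal * norm.ppf(1 - confLevel) * math.sqrt(risk))

# Scale to 12 periods via sqrt-of-time.
multiPeriodVaR = singlePeriodVaR * math.sqrt(12)

print(round(singlePeriodVaR, 4))  # 0.1163
print(round(multiPeriodVaR, 4))   # 0.4029
```

Note that this scaling inherits the "same loss each period" assumption criticized below, which is why single-period VaR is preferred where possible.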
Advantages of VaR:
Helpful for modeling worst-case outcomes.
Sanctioned in regulations and risk accords.
Understood even by non-finance people.
Measured and reported objectively.
Easy to aggregate across assets to create an end-to-end risk metric.
Disadvantages of VaR:
It holds only under short-horizon assumptions.
Multi-period VaR assumes that the loss will be the same in each trading period, which is a very bad assumption to make; stick to single-period VaR wherever possible.
VaR depends on the standard deviation only.
Skewness and kurtosis are ignored.
VaR is only as good as the variance plugged into it.
Functionally Functions in Pythonistic Python(s) by Pythonista! Part 2
August 27, 2020
6. Having a wrapper outside and inside.
The importance of a function wrapper is handling your data’s behaviour, while also making sure your wrapper is able to handle any kind of behaviour.
>>> def escape_unicode(f):
...     def wrap(*args, **kwargs):
...         x = f(*args, **kwargs)
...         return ascii(x)
...     return wrap
...
>>> def northern_city():
...     return 'Tromsø'
...
>>> print(northern_city())
Tromsø
>>> @escape_unicode
... def northern_city():
...     return 'Tromsø'
...
>>> print(northern_city())
'Troms\xf8'
7. Using a class to create a wrapper/decorator.
Any class-level attribute inside __call__() becomes an attribute of the function that the class is wrapped around!
>>> class CallCount:
...     def __init__(self, f):
...         self.f = f
...         self.count = 0
...     def __call__(self, *args, **kwargs):
...         self.count += 1
...         return self.f(*args, **kwargs)
...
>>> @CallCount
... def hello(name):
...     print('Hello, {}!'.format(name))
...
>>> hello('Fred')
Hello, Fred!
>>> hello('Wilma')
Hello, Wilma!
>>> hello('Betty')
Hello, Betty!
>>> hello('Barney')
Hello, Barney!
>>> hello.count
4
8. A wrapper created inside a class’s __call__() can be used to turn the wrapper on and off.
>>> class Trace:
...     def __init__(self):
...         self.enabled = True
...     def __call__(self, f):
...         def wrap(*args, **kwargs):
...             if self.enabled:
...                 print('Calling {}'.format(f))
...             return f(*args, **kwargs)
...         return wrap
...
>>> tracer = Trace()
>>> @tracer
... @escape_unicode
... def norwegian_island_maker(name):
...     return name + 'øy'
...
>>> norwegian_island_maker('Llama')
Calling <function escape_unicode.<locals>.wrap at 0x103b22ee0>
'Llama\\xf8y'
>>> norwegian_island_maker('Python')
Calling <function escape_unicode.<locals>.wrap at 0x103b22ee0>
'Python\\xf8y'
>>> norwegian_island_maker('Troll')
Calling <function escape_unicode.<locals>.wrap at 0x103b22ee0>
'Troll\\xf8y'
>>> tracer.enabled = False
>>> norwegian_island_maker('Llama')
'Llama\\xf8y'
>>> norwegian_island_maker('Python')
'Python\\xf8y'
>>> norwegian_island_maker('Troll')
'Troll\\xf8y'
9. map() can accept multiple iterables if the function being mapped accepts multiple arguments.
>>> sizes = ['small', 'medium', 'large']
>>> colors = ['lavender', 'teal', 'burnt orange']
>>> animals = ['koala', 'platypus', 'salamander']
>>> def combine(size, color, animal):
...     return '{} {} {}'.format(size, color, animal)
...
>>> list(map(combine, sizes, colors, animals))
['small lavender koala', 'medium teal platypus', 'large burnt orange salamander']
10. List comprehensions can be more complex, with multiple for loops and if statements.
>>> values = [x / (x - y) for x in range(100) if x > 50 for y in range(100) if x - y != 0]
>>> values = [x / (x - y)
...           for x in range(100)
...           if x > 50
...           for y in range(100)
...           if x - y != 0]
>>> values = []
>>> for x in range(100):
...     if x > 50:
...         for y in range(100):
...             if x - y != 0:
...                 values.append(x / (x - y))
...
11. list comprehensions could be Nested as well.
>>> vals = [[y * 3 for y in range(x)] for x in range(10)]
>>> outer = []
>>> for x in range(10):
...     inner = []
...     for y in range(x):
...         inner.append(y * 3)
...     outer.append(inner)
...
>>> vals
[[], [0], [0, 3], [0, 3, 6], [0, 3, 6, 9], [0, 3, 6, 9, 12], [0, 3, 6, 9, 12, 15], [0, 3, 6, 9, 12, 15, 18], [0, 3, 6, 9, 12, 15, 18, 21], [0, 3, 6, 9, 12, 15, 18, 21, 24]]
12. Some ideas for code introspection!
1. type() is of class type:
>>> i = 7
>>> repr(int)
"<class 'int'>"
>>> type(i) is int
True
>>> type(i)(78)
78
>>> type(type(i))
<class 'type'>
>>> i.__class__
<class 'int'>
2. isinstance() and issubclass() do type checking; type itself is a subclass of object.
>>> issubclass(type, object)
True
>>> type(object)
<class 'type'>
>>> isinstance(i, int)
True
3. To check whether a specific attribute exists on an object:
>>> a = 42
>>> getattr(a, 'conjugate')
<built-in method conjugate of int object at 0x10ff2cfb0>
>>> callable(getattr(a, 'conjugate'))
True
>>> a.conjugate.__class__.__name__
'builtin_function_or_method'
>>> getattr(a, 'index')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
AttributeError: 'int' object has no attribute 'index'
>>> hasattr(a, 'bit_length')
True
>>> hasattr(a, 'index')
False
4. globals() and locals() are dicts which keep track of your global and local names.
>>> globals()
{'__name__': '__main__', '__doc__': None, '__package__': None, '__loader__': <class '_frozen_importlib.BuiltinImporter'>, '__spec__': None, '__annotations__': {}, '__builtins__': <module 'builtins' (built-in)>}
>>> a = 42
>>> globals()
{'__name__': '__main__', '__doc__': None, '__package__': None, '__loader__': <class '_frozen_importlib.BuiltinImporter'>, '__spec__': None, '__annotations__': {}, '__builtins__': <module 'builtins' (built-in)>, 'a': 42}
>>> globals()['tau'] = 6.283185
>>> tau
6.283185
>>> tau / 2
3.1415925
>>> locals()
{'__name__': '__main__', '__doc__': None, '__package__': None, '__loader__': <class '_frozen_importlib.BuiltinImporter'>, '__spec__': None, '__annotations__': {}, '__builtins__': <module 'builtins' (built-in)>}
>>> def report_scope(arg):
...     from pprint import pprint as pp
...     x = 496
...     pp(locals(), width=10)
...
>>> report_scope(42)
{'arg': 42,
 'pp': <function pprint at 0x10e70bee0>,
 'x': 496}
>>>
5. import inspect.
Just like Golang’s reflect package.
>>> def num_vowels(text: str) -> int:
...     return sum(1 if c.lower() in 'aeiou' else 0
...                for c in text)
...
>>> import inspect
>>> sig = inspect.signature(num_vowels)
>>> sig.parameters['text']
<Parameter "text: str">
>>> sig.parameters['text'].annotation
<class 'str'>
>>> sig
<Signature (text: str) -> int>
>>> sig.return_annotation
<class 'int'>
>>> num_vowels.__annotations__
{'text': <class 'str'>, 'return': <class 'int'>}