More than two years ago (that's a long time!) I published a simple Python script to monitor a Twitter account using Tweepy: basic account information, inactive friends and new/lost followers. But this script stopped working a while ago because Twitter updated its API to version 1.1. This update made authentication mandatory for every request, and it also changed the rate limits. Before the update there was a limit of 150 or 350 requests per hour, depending on whether the request was authenticated or not, but now the limits are per request type and per 15-minute window. For example, you can make a maximum of 15 requests per quarter of an hour to get a list of friends, but another 15 to get a list of followers. If you are late (like me) with the new API, here you can find the full changelog.
Before starting to modify the code I also had to update Tweepy to version 2.1. The best and easiest way is using pip:
Description: Program based on readnfccc (by Renaud Lifchitz) to read some private data from credit cards, such as the cardholder name, Permanent Account Number (PAN), expiry date, etc., using NFC technology. It has been tested with Spanish contactless credit cards, but it can also be used with cards from other countries. Take a look at this post (Spanish) and this video.
I'm going to lay aside PDF files and malware for a moment and write a simple script to keep track of friends and followers on Twitter. We tend to have a lot of them, and it's hard to notice when a friend hasn't tweeted in a while or a follower has left. But we can use one of the many Python modules that talk to the Twitter API to solve this task. I've chosen Tweepy because I think it's very simple and well documented. What we want to obtain from Twitter is:
It's important to highlight that we cannot obtain all the friends/followers with a single API request, but only 100 at a time. We can use Tweepy's Cursor object to handle this pagination very easily:
followersCursor = tweepy.Cursor(tweepy.api.followers, id=user)
for follower in followersCursor.items():
    print follower.name
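Once the follower lists are stored, new and lost followers can be found by diffing two snapshots. A minimal sketch of the idea (the function and sample names are mine, not from the original script):

```python
def follower_changes(old, new):
    # Compare two follower snapshots and return (gained, lost)
    old, new = set(old), set(new)
    return sorted(new - old), sorted(old - new)

gained, lost = follower_changes(['alice', 'bob'], ['bob', 'carol'])
print(gained, lost)  # ['carol'] ['alice']
```

The same diff applied to the friends list, together with each friend's last tweet date, gives the inactive-friends report.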
Description: Little script to obtain a printable (C-style) shellcode from escaped JavaScript code. It also writes the resulting bytes to shellcode.out.
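The conversion itself is straightforward: %uXXXX escapes encode 16-bit little-endian values, so the low byte comes first, while %XX escapes are single bytes. A rough sketch of the approach (my own illustration, not the script's actual code):

```python
import re

def unescape_to_c(escaped):
    # Turn %uXXXX (little-endian 16-bit) and %XX escapes into \xNN bytes
    out = []
    for m in re.finditer(r'%u([0-9a-fA-F]{4})|%([0-9a-fA-F]{2})', escaped):
        if m.group(1):
            val = int(m.group(1), 16)
            out.append(val & 0xFF)   # low byte first
            out.append(val >> 8)
        else:
            out.append(int(m.group(2), 16))
    return ''.join('\\x%02x' % b for b in out)

print(unescape_to_c('%u9090%u41eb'))  # \x90\x90\xeb\x41
```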
Description: Script to analyze malicious PDF files containing obfuscated JavaScript code. It uses SpiderMonkey to execute the JavaScript code it finds and shows the shellcode to be launched. Sometimes it's not able to deobfuscate the code, but you can specify the -w parameter to write the JavaScript code to disk, helping to carry out a later manual analysis. Its output has five sections where you can find trigger events (/OpenAction and /AA), suspicious actions (/JS, /Launch, /SubmitForm and /ImportData), vulnerable elements, escaped bytes and URLs, which can be useful to get an idea of the risk of the file.
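The trigger/action sections can be approximated by simple keyword counting over the raw file. This is only an illustration of the approach under my own naming, not the script's implementation (which also has to deal with object streams and obfuscated names):

```python
import re

SUSPICIOUS = {
    'triggers': [b'/OpenAction', b'/AA'],
    'actions': [b'/JS', b'/Launch', b'/SubmitForm', b'/ImportData'],
}

def scan_pdf(data):
    # Count occurrences of each trigger/action keyword in raw PDF bytes
    return {section: {kw.decode(): len(re.findall(re.escape(kw), data))
                      for kw in keywords}
            for section, keywords in SUSPICIOUS.items()}

sample = b'<< /OpenAction << /S /JavaScript /JS (eval(unescape(x))) >> >>'
report = scan_pdf(sample)
print(report['triggers'], report['actions'])
```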
Description: This script compresses/decompresses a specified string or file using the Zlib library and writes the result to the standard output. If the input is a file and the method used is decompression, the script looks for the streams compressed with the /FlateDecode filter, so it's focused on PDF files. If there are no filters in the file, the whole file is treated as a single stream.
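The core of the PDF case can be sketched in a few lines: locate the stream ... endstream blocks and try to inflate each body with zlib. This is a simplification under my own names; the real script is more careful about filters and stream delimiters:

```python
import re
import zlib

def inflate_streams(pdf_bytes):
    # Find candidate stream bodies and try to inflate them;
    # bodies that are not Flate-compressed are silently skipped
    results = []
    for match in re.finditer(rb'stream\r?\n(.*?)\nendstream', pdf_bytes, re.DOTALL):
        try:
            results.append(zlib.decompress(match.group(1)))
        except zlib.error:
            pass
    return results

data = (b'1 0 obj << /Filter /FlateDecode >> stream\n'
        + zlib.compress(b'BT /F1 12 Tf ET')
        + b'\nendstream')
print(inflate_streams(data))  # [b'BT /F1 12 Tf ET']
```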
Description: Script that implements XOR brute-forcing of a given file, although a specific key can be used too. It's possible to look for a word in the XORed result, minimizing the output.
Usage: xorBruteForcer [-k xor_key] file [search_pattern]

Arguments:
  file: the source file to be XORed.
  search_pattern: pattern that must be found in the XORed result.

Options:
  -k xor_key: key used in the XOR function (00-ff). If not specified, all the possible values will be tested (brute-forcing).
# xorBruteForcer -k 25 geoloc > geoloc_xored_25
# xorBruteForcer geoloc_xored_25 GEoIpTOOl > out
Pattern found using the following keys: ['0X5', '0X25']
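The brute-forcing logic itself fits in a few lines of Python. A minimal sketch of the idea, with my own names (the real script also handles key formatting and writes the XORed output):

```python
def xor_bruteforce(data, pattern):
    # Try every single-byte key; return the keys whose XORed output
    # contains the search pattern
    hits = []
    for key in range(256):
        if pattern in bytes(b ^ key for b in data):
            hits.append(hex(key))
    return hits

xored = bytes(b ^ 0x25 for b in b'GEoIpTOOl rules')
print(xor_bruteforce(xored, b'GEoIpTOOl'))  # ['0x25']
```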
Description: Script which returns a list of hostnames of the given domain (and their resolved IPs) found in the given URL. The request can be recursive. This is useful to map all the hosts of an organization.
Usage: hostsGrabber [-r] url [domain]

Arguments:
  url: the URL of the page which must be searched for hostnames.
  domain: the domain of the hostnames to look for. By default it's the URL domain.

Options:
  -r: recursive
Host Name: bs-in-f104.1e100.net
IP Address: 64.233.163.104
Country: United States
Region: California
City: Mountain View
Longitude: -122.0574
Latitude: 37.4192
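The hostname extraction step can be sketched with a regular expression over the fetched page. This is only the core idea under my own names; the real script also fetches the URL, resolves each host and geolocates the IPs:

```python
import re

def grab_hostnames(page, domain):
    # Collect unique hostnames belonging to the given domain
    pattern = r'[\w.-]+\.%s' % re.escape(domain)
    return sorted(set(h.lower() for h in re.findall(pattern, page)))

page = 'See http://www.example.com/a, mail.example.com and www.other.org'
print(grab_hostnames(page, 'example.com'))  # ['mail.example.com', 'www.example.com']
```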