Author: mannetjieopennetworkscom

Copy to Clipboard in Google App Maker

One lovely summer's day Aragon was busy constructing an app for the Great Client when Surge, one of Cloud Valley's most trusted communication runners, burst into Aragon's dev bubble. Aragon closed his eyes, fighting off frustration as he turned around, knowing it would take him another twenty minutes to reconstruct the bubble and continue spinning code spells. But when he opened his eyes and saw the look of dread on Surge's face, all frustration was forgotten.

“What's going on?” Aragon asked as Surge tried to get the words out while getting air back into his lungs. “Creep….. creep…. there is creep” was all Surge could manage. Aragon felt a surge of adrenalin course through his body as the words hit him. “Breathe, Surge, I need details,” Aragon said, trying to calm his friend. Still gasping for air, Surge held out the enhancement scroll to Aragon and explained: “The Great Client submitted an enhancement that caused creep to start forming on the east side of Cloud Valley, and it's tied to your cloud app.”

Aragon took the scroll from Surge and unrolled it. He could clearly see that the enhancement was hexed, causing the creep to appear. The enhancement, short and clear (“Copy to Clipboard button on App Maker platform”), glowed with the dark magic running through it. “Thank you, Surge. We will have to move fast on this one if we are to have any hope of stopping the creep; the spell on this one is strong,” Aragon said, turning back to his console and already pulling up search screens and access panels.

A quick search through the wisdom scrolls provided no help, and the internet synapse was no help either. With time of the essence and the creep rolling into Cloud Valley like a tsunami, Aragon remembered that somewhere in his personal archive he had a code spell. Digging through his archive he found what he was looking for: a function spell that would do exactly what was needed.

function copyTextToClipboard(text) {
  var textArea = document.createElement('textarea');

  // Keep the temporary textarea out of the way so it does not disturb the layout.
  textArea.style.position = 'fixed';
  textArea.style.top = 0;
  textArea.style.left = 0;
  textArea.style.width = '2em';
  textArea.style.height = '2em';
  textArea.style.padding = 0;
  textArea.style.border = 'none';
  textArea.style.outline = 'none';
  textArea.style.boxShadow = 'none';
  textArea.style.background = 'transparent';

  textArea.value = text;
  document.body.appendChild(textArea);
  textArea.select();

  try {
    var successful = document.execCommand('copy');
    var msg = successful ? 'successful' : 'unsuccessful';
    console.log('Copying text command was ' + msg);
  } catch (err) {
    console.log('Oops, unable to copy');
  }

  document.body.removeChild(textArea);
}

Aragon put the function spell into the repository pipeline and watched as it blasted off into the distance. As it entered the repository, the creep stopped moving for a second and then slowly receded into nothing.

Aragon fell back into his chair with a great sigh of relief, seeing Surge standing with a big smile on his face. Cloud Valley was safe for another day.


From Flat to FAT

Disclaimer: This is not an anti-diet blog post.

So there I was: in my left hand I had a CSV scroll straight out of AD (Active Directory), listing 16000 folders and the specific access rights associated with them. In my right hand I had a flat stone Google Drive tablet with a list of folder names and folder IDs chiseled on it, without any chains of security applied.

What the Migration Clan decided to do was to make a copy of the Great Client's file server in Google Drive. This posed the problem of assigning the correct access rights to all 16000 folders in the Google Drive, a daunting task that would have taken weeks to do manually. Hence the Migration Clan requested the help of the Development Wizards.

Assigning access rights to Google Drive folders is as easy as one REST call, so that would not be the problem. The problem was that I had a list of folders in a tree structure ( E:\folder1\subfolder1\subfolder2 ) while Google Drive is a flat structure linked with IDs, not folder names. I consulted the stackoverflow scrolls, but the only advice the wise men could offer was to rebuild the flat structure into a tree structure and work from there. So, to share, here is the python spell I came up with:
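For reference, that one REST call is the Drive v3 permissions().create method; here is a minimal sketch of it. The make_permission helper is made up for illustration, and the email address is a placeholder.

```python
# Made-up helper that builds the request body the Drive v3
# permissions().create call expects.
def make_permission(email, role='writer'):
    return {'type': 'user', 'role': role, 'emailAddress': email}

body = make_permission('')  # placeholder email

# With a built Drive service, the actual call would look like:
# drive_service.permissions().create(fileId=folder_id, body=body).execute()
print(body['role'])
```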

First I got an array of all the folders in the Google Drive:


gFolderArray = []
results = drive_service.files().list(
    q="mimeType='application/'",
    pageSize=1000,
    fields="nextPageToken, files(parents, id, name, mimeType)").execute()
items = results.get('files', [])
for item in items:
    if 'parents' not in item:
        folderitem = {'name': item['name'], 'id': item['id'], 'parentID': '0ACgdWVFodebfUk9PVA'}
    else:
        folderitem = {'name': item['name'], 'id': item['id'], 'parentID': str(item['parents'][0])}
    gFolderArray.append(folderitem)
while "nextPageToken" in results:
    results = drive_service.files().list(
        q="mimeType='application/'",
        pageSize=1000,
        pageToken=results["nextPageToken"],
        fields="nextPageToken, files(parents, id, name, mimeType)").execute()
    items = results.get('files', [])
    for item in items:
        if 'parents' not in item:
            folderitem = {'name': item['name'], 'id': item['id'], 'parentID': '0ACgdWVFodebfUk9PVA'}
        else:
            folderitem = {'name': item['name'], 'id': item['id'], 'parentID': str(item['parents'][0])}
        gFolderArray.append(folderitem)

Now that I have an array with the folder names, IDs and parent IDs, I can start weaving the pythonic magic:

searchDict = gFolderArray
for folderdict in gFolderArray:
    finalflag = True
    fPath = folderdict['name']
    newparent = folderdict['parentID']
    while finalflag:
        if newparent == '####……':
            fPath = 'root\\' + fPath
            finalflag = False
        else:
            newparentTemp = ""
            for fparent in searchDict:
                if fparent['id'] == newparent:
                    newparentTemp = fparent['parentID']
                    pName = fparent['name']
            newparent = newparentTemp
            fPath = pName + '\\' + fPath
    folderdict['path'] = fPath

Now let's have a look at what I do here. First I create a duplicate array to search from. Why, you ask? Well, I had all kinds of issues trying to access the same array from two processes, so with a time limit the easiest thing was to just make a duplicate array. The "finalflag" is there to signal when we have reached the root folder, which we identify by its ID: "newparent == '####……'".
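To see the parent-chain climb in action on toy data, here is a self-contained sketch of the same idea. The folder names and IDs are made up, with 'ROOT' standing in for the real root folder ID, and I index the folders by ID so each lookup is a dict access instead of a linear scan.

```python
# Made-up folders mimicking the gFolderArray entries above.
folders = [
    {'name': 'folder1', 'id': 'A', 'parentID': 'ROOT'},
    {'name': 'subfolder1', 'id': 'B', 'parentID': 'A'},
    {'name': 'subfolder2', 'id': 'C', 'parentID': 'B'},
]

# Index by ID so each parent lookup is O(1).
by_id = {f['id']: f for f in folders}

for f in folders:
    path = f['name']
    parent = f['parentID']
    # Climb the parent chain until we hit the root.
    while parent != 'ROOT':
        node = by_id[parent]
        path = node['name'] + '\\' + path
        parent = node['parentID']
    f['path'] = 'root\\' + path

print(folders[2]['path'])  # root\folder1\subfolder1\subfolder2
```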

So after spinning up some magic, the Migration Clan could go back to the Great Client with happy faces, saving Cloud Valley from the wrath of the Great Client.

Faking it

Ever needed to let your GCP code act as a specific person on the domain? This is a situation that pops up more and more these days in my line of work. Need to put something in a specific calendar, need to move that file to a specific Google Drive… The list goes on and on.

So the question is: can this be done? And the answer is YES. With delegated credentials this is possible. How much extra work? The good news is that it is one extra click and one extra line of code. Let's have a look at how this works.

You will need a service account to make this work. So let's have a look at the steps. On the Google Cloud Console go to “IAM & admin” -> “Service Accounts”:

Screen Shot 2017-08-28 at 11.44.16 AM

Then make sure to create an account with the “Enable G Suite Domain-wide Delegation” option ticked:

Screen Shot 2017-08-28 at 11.45.05 AM

Remember to give the service account API access in the Admin panel of the domain. (I assume that you know how to do this. If not, just Google it; there are many help pages on that.)

So from here let's have a look at the code (in Python today):

from oauth2client.service_account import ServiceAccountCredentials
from googleapiclient.discovery import build
import httplib2

scopes = ''  # the scope URL(s) for the APIs you need go here
credentials = ServiceAccountCredentials.from_json_keyfile_name('./serviceAccountKey.json', scopes)
delegated_credentials = credentials.create_delegated('')  # the user's email address goes here
http_auth = delegated_credentials.authorize(httplib2.Http())
directoryService = build('admin', 'directory_v1', http=http_auth)

First we set up the scopes for the application. Secondly we use the service account credentials (JSON key file) to create the credentials object. The next line is where the magic happens.

We use the create_delegated function to specify the user we want to “emulate” or delegate to be. After the http_auth we build the service and access it AS the user specified.

So if you need to FAKE it, that's how you do it 😉

Time Travel

Sometimes I can just kick my own backside. A feeling, I believe, all developers have felt at some stage (like forgetting a semicolon).

So I have a cron job on Google AppEngine that stops and starts a server for a client every day (don't ask why). For the last week that this cron job has been running, it stops the server, but every morning the server will not be started.

So I tested my code, and it ran perfectly every time. Then this morning, after the server did not start up AGAIN, I had a second look at the cron jobs, and that's when it happened: I had forgotten to set the time zone for the cron jobs. They were running on GMT and not our local time zone, which put my cron jobs out by two hours. So the server would start up, just not at the time I wanted it to.
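Just to make the skew concrete, here is a quick sketch, assuming a fixed +2:00 local offset as an example (your actual offset depends on your zone and daylight saving):

```python
from datetime import datetime, timedelta, timezone

# Example local zone: two hours ahead of GMT.
local = timezone(timedelta(hours=2))

# A cron schedule with no timezone runs on GMT, so a job meant for
# 06:00 local fires at 06:00 GMT instead...
run_gmt = datetime(2017, 8, 28, 6, 0, tzinfo=timezone.utc)

# ...which is 08:00 on the local clock: two hours late.
print(run_gmt.astimezone(local).strftime('%H:%M'))  # 08:00
```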

So when setting up a cron job in AppEngine, please remember to set the timezone so that you do not have to sit with the same issue I had. It is as easy as adding the following line to your schedule:

timezone: Australia/NSW
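In context, a full cron.yaml entry with the timezone set might look like this; the description, URL and schedule here are made up for illustration:

```yaml
cron:
- description: start the client server
  url: /tasks/start-server
  schedule: every day 08:00
  timezone: Australia/NSW
```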

Your timezone can be found on the Wikipedia list of tz database time zones.

So do not let your code time travel.

Gmail Logs in BigQuery – The Plot Thickens

As you all know, I have been knee deep in the Gmail logs that get streamed into BigQuery.

Today's musing is just a short heads up on something that we have found in the logs.

While doing a search on a specific email, I had for some reason been getting duplicates in my results. Here is what is happening. I do a SQL query with message_info.source.address, message_info.flattened_destinations and message_info.subject in the WHERE clause, each equal to specific search criteria. This then produces duplicate results: a single email is shown twice. Now this got my mind spinning.

So I started digging. I first got the message ID of the email and did a select on just that specific message ID (message_info.rfc2822_message_id). Looking at the results, I found the following. First, the email gets listed with each message_info.message_set.type; that looks all good. Then the email gets listed with the same set types but without any subject in the log. Lastly, the email gets listed in the log again (and we are talking microseconds apart) with a message_info.message_set.type of 16. That was the reason for the duplicates.

Now, Set Type 16 is not documented, so I have no idea what is happening there. I have logged a support call with Google for further explanation and will keep you guys up to date.

In the meantime, to make sure my queries return correct results, I have added the following two lines to my WHERE clause.

message_info.source.address <> 
and message_info.message_set.type <> 16

Till Next Time 🙂

Mounting Google Storage on an OpenVZ Debian Jessie container

So I have a few OpenVZ containers with Debian Jessie running on them. I needed to be able to move files from a container to Google Storage. I jumped on Google and started looking at my options. GCSFUSE was the answer I got.

Now, as with any documentation you find out there, half of it works and half does not, and you end up spending a lot of time searching for weird and wonderful errors. So this is what I found and what worked for me.

Firstly you need to enable access for the containers to the fuse file system on the host. On the host system run the following commands:

modprobe fuse 
vzctl set VEID --devnodes fuse:rw --save

Where VEID is the veid of the container you want to give access to.

Then I spun up a new Debian Jessie container and ran the following commands:

apt-get update
apt-get install sudo
apt-get install curl
apt-get install apt-transport-https

These packages are not installed by default, so installing them gets the basic applications in place for the installation. Then we add the correct repository to apt for easy installation:

export GCSFUSE_REPO=gcsfuse-`lsb_release -c -s`
echo "deb $GCSFUSE_REPO main" | sudo tee /etc/apt/sources.list.d/gcsfuse.list
curl | sudo apt-key add -

Then I install the gcsfuse package on the system.

apt-get update
apt-get install gcsfuse

Next we create the mount point where we want to mount our Google storage bucket.

mkdir /target/folder

There is a bug in Debian Jessie where the rights on the fuse device are not correct, so we need to make sure we have the access we need:

chmod g+rw /dev/fuse

After this it is a good idea to do a good old reboot of the system to make sure everything starts up and works as expected. I have found a lot of forum posts where, after all the setup, people were getting errors (me included), and a reboot of the system got everything working as expected.

Now if you have not done the setup on your Google cloud project, here is a quick rundown of what needs to be done.

  1. If you have not created a project on your Google Cloud Console yet, I would suggest popping over there and creating it.
  2. Create your storage bucket.
  3. Create a Service Account and give it the role of Storage – Storage Object Creator.
  4. Download the JSON key file for the service account.
  5. Remember to give the Service Account access to your bucket, or else you will get Authorisation errors.

The steps above are well documented, and a quick Google search will furnish you with what you need to know.

Make sure to copy your key file to your container.

This brings us to the moment of truth, as they say. Mounting the storage bucket to our file system:

gcsfuse --key-file /path/keyfile.json bucket-name /target/folder

At this point your Storage bucket should be mounted to your target folder, and you can interact with it like any other folder on your system. For a last bit of admin work, to make life easier, we add our mount to /etc/fstab so that it gets mounted automatically after a reboot. Add the following to your /etc/fstab file:

bucket-name /target/folder gcsfuse rw,noauto,user,key_file=/path/keyfile.json

And that's it for today. As usual, hope this helps someone out there.

Email Logs in BigQuery

So Google has added an Admin feature where you can stream your email logs to BigQuery. Read up on it here: Email Logs in Big Query

Oh, this is so nice for all of you out there who want to run reports on your emails. The one catch is understanding the logs. After spending some time on this, I would like to bring you a quick and dirty run-through.

In the link I shared earlier you will find links to query examples and the schema documentation. I personally did not find the schema documentation very user friendly and had to spend some time working through the document and the queries.

So to help you out, here is a quick rundown of what I found:

Each email has multiple entries in the logs. So to count all emails you will have to do a “count” with a “DISTINCT” on message_info.rfc2822_message_id.
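For example, a minimal legacy-SQL sketch of such a count might look like this (the table name here is made up):

```sql
SELECT COUNT(DISTINCT message_info.rfc2822_message_id) AS MessageCount
FROM [my_dataset.daily_20170220]
```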


To go one step further, let's say you want to split your results into incoming and outgoing emails. For this we will have to specify the message set type in our WHERE clause.

(message_info.message_set.type = 8 or
message_info.message_set.type = 9)

The numbers mean the following:

  • 8 – Total outgoing messages
  • 9 – Total incoming messages

Now let's take this one step further. Let's say you want to split that up into messages from outside your domain and those internal to your domain. For this we need to look not just at message_info.message_set.type but also at message_info.source.service.

Firstly let's have a look at message_info.message_set.type. We already know what numbers 8 and 9 mean, so let's add to that list:

  • 1 – ‘Inbound email from outside domain’
  • 2 – ‘Sent outside domain’
  • 8 – ‘Total outgoing messages’
  • 9 – ‘Total incoming messages’
  • 10 – ‘Message internal to your domain’

As you can see, there is only one entry for emails internal to the domain. So how would we tell the difference between incoming and outgoing emails? This is where message_info.source.service comes into play. To keep it simple I am only going to stick to two possible values of message_info.source.service: ‘smtp-inbound’ and ‘gmail-ui’. As the name states, smtp-inbound is inbound email; what is not stated so clearly is that gmail-ui means outbound email (yes, it took me a while to figure that one out). So our where clause would look something like this:

(message_info.message_set.type = 2 or
message_info.message_set.type = 10 or
message_info.message_set.type = 1 or
message_info.message_set.type = 8 or
message_info.message_set.type = 9)
and
(message_info.source.service = 'gmail-ui' or
message_info.source.service = 'smtp-inbound')

So to finish this off, here is the SQL for a query to get all email counts for a single email address in your domain, broken up into outgoing and incoming, internal and external to the domain, for the last 7 days.

SELECT
count(DISTINCT message_info.rfc2822_message_id) as MessageIDCount,
CASE
WHEN message_info.message_set.type = 1 then 'Inbound email from outside domain'
WHEN message_info.message_set.type = 2 then 'Sent outside domain'
WHEN message_info.message_set.type = 8 then 'Total outgoing messages'
WHEN message_info.message_set.type = 9 then 'Total incoming messages'
WHEN message_info.message_set.type = 10 then 'Message internal to your domain'
end as SetType,
CASE
WHEN message_info.source.service = 'smtp-inbound' then 'Incoming'
WHEN message_info.source.service = 'gmail-ui' then 'Sending'
end as EmailType,
sum(message_info.payload_size) / 1024 / 1024 as TotalSize,
Date(USEC_TO_TIMESTAMP(event_info.timestamp_usec)) as QDate
FROM (TABLE_DATE_RANGE([2017_2_20_gmail_logs_dataset2.daily_],
DATE_ADD(CURRENT_TIMESTAMP(), -7, 'DAY'), CURRENT_TIMESTAMP()))
WHERE
(message_info.flattened_destinations like '' OR
message_info.source.address = '' )
and
(message_info.message_set.type = 2 or message_info.message_set.type = 10
or message_info.message_set.type = 1 or
message_info.message_set.type = 8 or
message_info.message_set.type = 9)
and
(message_info.source.service = 'gmail-ui' or
message_info.source.service = 'smtp-inbound')
group by EmailType, SetType, QDate
order by QDate, SetType, EmailType

I hope this helps clear up some of the Google email logs for some of you.