Thursday, 1 August 2013

Automo-mail

Information is the key to getting people involved, but too much of it causes disinterest and can even prevent involvement. Informing the right people also takes a lot of planning.

Three groups of people need to be informed.
1. End user (the registered owner)
2. The end user's manager
3. Last logged-in user

Each user can handle a single mail for each computer in question, but a department manager could be very irritated to get 30 emails that don't necessarily require action. The mail to the manager should instead be built from a multidimensional array rendered as a matrix, which will quickly inform the manager about each computer and its status. Owner, location, last user, model, and date and time of exchange should be available.
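As a rough sketch of that matrix (a PHP helper with invented column names; the rows are assumed to come from the query described next), the array could be rendered as an HTML table for the email body:

    // Hypothetical helper: render one manager's computers as an HTML table.
    // $rows is assumed to be an array of associative arrays from the database.
    function buildManagerMatrix(array $rows) {
        $html = '<table border="1"><tr><th>Owner</th><th>Location</th>'
              . '<th>Last user</th><th>Model</th><th>Exchange date/time</th></tr>';
        foreach ($rows as $r) {
            $html .= '<tr>';
            foreach (array('owner', 'location', 'last_user', 'model', 'exchange_at') as $col) {
                $html .= '<td>' . htmlspecialchars($r[$col]) . '</td>';
            }
            $html .= '</tr>';
        }
        return $html . '</table>';
    }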

To get the information, the query should collect the matrix based on the manager's mail address (select id, [fields] where manager_id = [string]). An update on the same objects should then be made, setting mail_to_mgr to true so it's not possible to send duplicates of the information (update ... where mgr_id ...).
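A minimal sketch of that two-step flow, assuming PDO; the table name, columns, and credentials are all illustrative:

    // Minimal sketch; "computers", its columns, and the credentials are assumptions.
    $db = new PDO('mysql:host=localhost;dbname=rollout', 'user', 'pass');
    $mgrMail = 'manager@example.com';

    $select = $db->prepare(
        'SELECT id, owner, location, last_user, model, exchange_at
           FROM computers WHERE manager_id = ? AND mail_to_mgr = 0');
    $select->execute(array($mgrMail));
    $rows = $select->fetchAll(PDO::FETCH_ASSOC);

    // Flag the same rows right away so no duplicate information can go out.
    $db->prepare('UPDATE computers SET mail_to_mgr = 1
                   WHERE manager_id = ? AND mail_to_mgr = 0')
       ->execute(array($mgrMail));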

That's simple enough. Now, how do we send a bunch of mail to a bunch of different people without clicking one at a time? Once again, a multidimensional array: each first-level array gets passed to the mail function. Perhaps all of this and the managers' lists can be done in one pass, polling each manager's list of clients and passing it to the manager email, then polling the users list and pumping the mails out to them (see the sketch below). It's been suggested that we contact the managers to see if there is a time which works best for them, but that would mean adjusting ourselves to 72 different managers... Something says that won't work so well. We are, however, looking at too many computers: I forgot to filter out those that are already collected. :(
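A hedged sketch of that batching, using PHP's mail() and the buildManagerMatrix() helper from above; $byManager is an assumed grouping of database rows keyed by manager address:

    // One first-level array per manager, one summary mail each.
    foreach ($byManager as $mgrMail => $rows) {
        $body    = buildManagerMatrix($rows);   // HTML table from the earlier sketch
        $headers = "MIME-Version: 1.0\r\n"
                 . "Content-Type: text/html; charset=UTF-8\r\n"
                 . "From: rollout@example.com\r\n";
        mail($mgrMail, 'Computer exchange summary', $body, $headers);
    }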

Report of progress here will come later.

Update 18-Aug:
The multidimensional arrays worked out very well. Each computer has a database field for each recipient; as soon as a recipient has been informed, that field is updated with the date and time the mail was sent. The mail function sends to the list collected from the database, which excludes the records where the mail has already gone out (sketched below). This enables me to send an individual mail to those who are manually configured and a collective email with dynamic values for all the rest, based on different criteria.

I chose to base the emails on the day we scheduled the exchange, which sends mail to each manager for each day. That splits up the load and keeps the users and managers informed on a per-date basis instead of with one bulk informational email. It also keeps the rollout flexible if a crisis should prevent us from keeping the schedule: as long as users are not yet both planned and informed, we can easily push all remaining plans down one day.
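A minimal sketch of that bookkeeping, reusing the assumed $db connection from earlier; one sent-timestamp column per recipient type is an assumption, and only the end-user column is shown:

    // Pick up only the rows for this day whose mail has not yet gone out.
    $day = '2013-08-19';
    $due = $db->prepare(
        'SELECT id, user_mail FROM computers
          WHERE exchange_date = ? AND mail_sent_user IS NULL');
    $due->execute(array($day));

    // Stamp each row the moment its mail has actually been sent.
    $stamp = $db->prepare('UPDATE computers SET mail_sent_user = NOW() WHERE id = ?');
    foreach ($due->fetchAll(PDO::FETCH_ASSOC) as $row) {
        $body = 'Your computer is scheduled for exchange on ' . $day . '.';
        if (mail($row['user_mail'], 'Computer exchange ' . $day, $body)) {
            $stamp->execute(array($row['id']));
        }
    }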

Week two starts tomorrow. So far there has been only one complaint: on Tuesday last week a manager got the email the day before the exchange, since we decided there really was no reason to wait until Wednesday to roll. Since that day the information has gone out two or three business days in advance, with no further complaints. We have no plan for Wednesday yet, but I may move Thursday to Wednesday once we know Monday's results. Friday is no good day to roll out computers, so it's also possible to move Friday to Wednesday. We'll see how the team feels.

Thursday, 25 July 2013

Deployment confirmation

One of the least mobile things in the deployment is the information about who gets what and where. We can't reasonably keep that information updated by carrying around a bulky laptop while we deliver computers, so I need a mobile interface that lets me input data and submit it to a database that keeps the deployment records current.
Google Sheets could be the right thing for this, since it allows access from outside the intranet. The only problem is connecting the submission to the right order. That's not such a big issue if I can build a predefined link from my page to Google that pre-fills the fields I already have information for and gives me quick access to update the parts I don't have information for. Easy solution.
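A rough sketch of building such a pre-filled link in PHP; the form key and the entry field names are placeholders that would have to be read from the actual Google form:

    // Hypothetical order data and pre-fill link; field IDs are placeholders.
    $order  = array('computer' => 'PC-0042', 'location' => 'Building B');
    $params = http_build_query(array(
        'entry_0' => $order['computer'],   // computer/asset name
        'entry_1' => $order['location'],   // delivery location
    ));
    $link = 'https://docs.google.com/spreadsheet/viewform?formkey=FORM_KEY&' . $params;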

Update:
This worked great. Using importrange in a couple of places to get the data from the submission, through the storage reports, and back out to the customer turned this into a live solution in a matter of a few hours. Updates do take about five minutes to propagate through the whole system, though.
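For reference, the formula looks roughly like this (the spreadsheet key and range are placeholders):

    =IMPORTRANGE("spreadsheet_key_or_url", "Responses!A1:F100")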

Update 18-Aug:
One bug that was relatively easy to fix: when comparing the number of results returned by the query to the database, it would sometimes run an integer comparison against the string value, which prevented any results from showing. Using intval fixed the issue.
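Roughly, with invented variable names:

    // The count comes back from the database driver as a string (e.g. "3"),
    // so a strict comparison against an integer never matched.
    $result = array('cnt' => '3');     // illustrative query result
    if (intval($result['cnt']) > 0) {  // normalise to int before comparing
        // render the results
    }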

The only other reported problem is that Google Forms sometimes freezes and forces a restart of the process. That didn't happen too often, though, so no action was taken there.

Feedback was also given that it should be possible to see the status of the computer being reported, so we can prevent multiple entries. That is under construction with the spreadsheets API and will hopefully lead to further development in the use of spreadsheets for storage control.

Link a dynamic appointment in a dynamic email

This was quite a success. I just need to link a CSV file to the page so it creates the emails and lets me push them out in batches, keeping a record of which people have received which mails, when, and what content each mail contained. That's easy with an identification number for each row (a minimal sketch follows).
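A minimal sketch of that idea, assuming a CSV with id, email, subject, and body columns, an assumed mail_log table, and the $db connection from the earlier sketches:

    // Read the CSV, send one mail per row, and log exactly what went out.
    $fh   = fopen('appointments.csv', 'r');
    $head = fgetcsv($fh);                       // assumed: id,email,subject,body
    $log  = $db->prepare(
        'INSERT INTO mail_log (row_id, recipient, content, sent_at)
         VALUES (?, ?, ?, NOW())');
    while (($line = fgetcsv($fh)) !== false) {
        $row = array_combine($head, $line);
        if (mail($row['email'], $row['subject'], $row['body'])) {
            $log->execute(array($row['id'], $row['email'], $row['body']));
        }
    }
    fclose($fh);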

Thursday, 18 July 2013

Random thoughts of the Find

Input comes from CSV files, so I need to be able to parse them correctly into the database. Dumping them in works for now, but it won't hold in the long run, because I need validation to make the database understand the data. I also need to be able to update the data that already exists with each dump... That seems like it might actually take a lot of time to code, but I'm going on the hunch that it will save hours of validating data in the future.
I suppose I could use the existing dump of multiple CSV files and just skip the input from each one where the information from the file already exists in another table.
Either way, I need to look into how to use the MySQL functions together with file uploads to remove the need to log into the database to input the data.
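A hedged sketch of that upload-and-upsert idea, reusing the assumed $db connection; a unique key on a serial_no column is an assumption, and MySQL's INSERT ... ON DUPLICATE KEY UPDATE handles the "update what already exists" part:

    // Handle an uploaded CSV and upsert each row; names are illustrative.
    $fh = fopen($_FILES['csv']['tmp_name'], 'r');
    fgetcsv($fh);                                // skip the header row
    $stmt = $db->prepare(
        'INSERT INTO computers (serial_no, owner, model) VALUES (?, ?, ?)
         ON DUPLICATE KEY UPDATE owner = VALUES(owner), model = VALUES(model)');
    while (($line = fgetcsv($fh)) !== false) {
        $stmt->execute($line);                   // assumes three columns per row
    }
    fclose($fh);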