Soliant Consulting


FileMaker Profile

  • Certification
    FileMaker 15 Certified Developer


  1. FileMaker 18 introduces two new cryptographic functions which can be used to generate and verify digital signatures:

Generate digital signature: CryptGenerateSignature ( data ; algorithm ; privateRSAKey ; keyPassword )

Verify digital signature: CryptVerifySignature ( data ; algorithm ; publicRSAKey ; signature )

Why Use Digital Signatures: An Example

Digital signatures are a digital analog to the good old-fashioned physical signatures we're all familiar with. They serve a similar function as physical signatures, but they also provide additional benefits which physical signatures do not. Let's look at an example.

Back in 2011, I lived in a Chicago condominium building which had six units. In Chicago, the city handles waste removal (trash pickup) for buildings with four units or fewer, and larger properties are required to contract with a private waste management company. These contracts are frequently structured in a way that causes homeowners to overpay. Here's one way this is done: the contract is set up with a negotiated rate for three years, and the customer agrees to the rate and signs the contract. Buried in the fine print is an explanation that the contract will automatically renew at a rate specified by the waste management company (which ends up being much higher than the original rate), and that canceling the automatic renewal must be done within a short time window that ends a month before the original three-year contract expires (probably to increase the chance that the homeowner overlooks it).

Our condo association was stuck in one of these automatic renewals for a year, and as that extra contract year came to an end, we shopped around for a better deal. We eventually decided to go with one of the other waste management companies, and I signed a one-year contract with them.
Our current waste removal company found out we were not going to renew, so they called, offered better terms, and explained that if we extended our contract with them, it would supersede the new contract I had signed. As part of the back and forth over canceling the new contract, the new company sent me a copy of the contract I had signed, and I noticed a small difference between what they sent me and the copy I had kept after signing it: they had removed the contract expiration date on their "official" copy of the contract. If we had stayed with them, and if I hadn't kept a copy of the contract I signed, they could have claimed that I signed a contract for much longer than one year.

My copy of the contract, with the end date specified in the "Terms of Agreement" section:

Figure 1 – Original contract with end date

Their "official" copy of the contract, with the end date removed:

Figure 2 – Contract with the end date removed

The integrity of the contract had been compromised, and this is something that would not have been possible if I had signed the contract using a digital signature. If a digitally signed contract is tampered with, the signature is automatically invalidated. Before we look into how that works, let's review the benefits provided by digital signatures.

The Benefits of Using Digital Signatures

Digital signatures can provide these three benefits:

Authentication – If I sign something, others will know it was signed by me and not someone else, because my signature is uniquely mine and cannot be forged. Physical signatures afford this protection to a degree, but of course, forgeries are possible. Digital signatures are created using certificates issued by Certificate Authorities. With a digital signature, I cannot impersonate someone else, because the Certificate Authority will not issue a certificate to me unless I am able to demonstrate that I really am who I say I am.

Non-repudiation – If I sign something, I cannot later claim that the signature didn't come from me. With a physical signature, I could try to claim that the signature was forged by someone else. With a digital signature, I cannot claim that I did not sign something while at the same time claiming that my private key is indeed private.

Data integrity – If I sign something, I want to feel assured that the contents of what I signed will not change later. As we saw with my contract example above, physical signatures do not provide this benefit. If a digitally signed document is tampered with, the signature will not verify.

Cryptographic Hash Functions

Before we explain how digital signatures work, let's do a quick cryptography primer, starting with cryptographic hash functions. A hash function takes some data (for example, a document) as an input and outputs some other data of fixed size. The input is referred to as the message, and the output is referred to as the digest or hash. Cryptographic hash algorithms have these properties:

A digest can be computed quickly.
The original message cannot be derived from the digest.
Even a small change in the message will result in a completely different digest.
Digests always have the same length.
Collisions (where two different messages create the same digest) are hard to create.

For example, the digest of "hello world" (using the SHA-256 hash algorithm) is:

b94d27b9934d3e08a52e52d7da7dabfac484efe37a5380ee9088f7ace2efcde9

And the SHA-256 digest of the 343-word lyrics of "Stairway To Heaven" is:

7b0f1b5dc9ce51751a4db89f3bd8f51456b65e3df260693e5e31883849193db2

Because collisions are unlikely, hash functions that obey the properties listed above can be used to generate a sort of unique identifier.
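These properties are easy to observe in practice. The short Python sketch below (illustrative, not part of the original post) uses the standard library's hashlib to show the fixed digest length and the avalanche effect:

```python
import hashlib

# A digest is quick to compute; "hello world" hashes to the value quoted above.
digest = hashlib.sha256(b"hello world").hexdigest()
print(digest)  # b94d27b9934d3e08a52e52d7da7dabfac484efe37a5380ee9088f7ace2efcde9

# Even a one-character change produces a completely different digest.
digest2 = hashlib.sha256(b"hello world!").hexdigest()
assert digest != digest2

# Digest length depends only on the algorithm, never on the message size.
assert len(hashlib.sha256(b"x" * 1_000_000).hexdigest()) == 64
assert len(hashlib.sha512(b"x").hexdigest()) == 128
```

Note that a 64-hex-character digest like the one above is characteristic of SHA-256; a SHA-512 digest is twice as long.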
If I use a cryptographically strong hash algorithm, it is essentially impossible for me to come up with a different message that generates the same digest as "hello world." (We'll revisit collisions a bit later.)

Encryption

An encryption algorithm takes some readable text (referred to as cleartext or plaintext) or binary data and converts it into unintelligible data. Decryption is the inverse: it converts unreadable, encrypted data back into intelligible content. There are two types of encryption: symmetric and asymmetric. Symmetric encryption uses the same secret key to both encrypt and decrypt. Asymmetric encryption uses a public-private key pair: one key is used to encrypt and the other to decrypt.

Digital signatures take advantage of this property. The private key is used to generate the signature. This can only be done by someone in possession of the private key, so it's critical that you keep your private key well secured. The public key is used to verify the signature. The public key can be published far and wide; a copy can be given to whoever asks for it without compromising the security of the signature.

How Digital Signatures Work

Generating the signature:

1. A cryptographic hash algorithm is used to generate a digest of the message that you would like to have signed.
2. That digest is then encrypted using an asymmetric encryption algorithm and a private key. The encrypted hash serves as the digital signature.

An identifier for the hash algorithm that was used to create the digital signature is embedded in the signature itself, so that when somebody verifies the signature, they know which hash algorithm to use.

Verifying the signature:

1. The original message, the digital signature, and a copy of the public key are obtained.
2. The public key is used to decrypt the digital signature, revealing the plaintext digest and the identifier of the hash algorithm that was used to generate the signature.
3. That hash algorithm is then used to generate a new digest of the message.
Finally, the two plaintext digests are compared. If they are the same, the digital signature is considered verified.

Security Principles Revisited

Data integrity – If the message is altered in any way (even by just a single bit), the message digest will change too, and the signature will no longer verify. This is what provides data integrity, and it is what was missing in my experience with the waste management company.

Non-repudiation – If I signed the message, I cannot later claim both (1) that I did not sign it and (2) that my private key is indeed private.

Authentication – The identity of the owner of the private key is embedded in the certificate that contains the public key. To prevent others from impersonating them, the key owner can have the key signed by a Certificate Authority (CA). As part of this process, the CA will validate the authenticity of the key owner's identity. Without the involvement of the Certificate Authority, you would have to take it on faith that the sender of the message is who they say they are. With the Certificate Authority signing the public key, you instead have to take it on faith that the Certificate Authority did its job properly in validating the authenticity of the key owner. The job of deciding which Certificate Authorities to trust is generally delegated to the makers of the major web browsers (Mozilla, Google, Microsoft, Apple, etc.). Back in 2005, they established the CA/Browser Forum to ensure that certificate authorities follow best practices (a set of baseline requirements) when they issue certificates.

Generating a Digital Signature Using FileMaker

A digital signature can be generated using this function:

CryptGenerateSignature ( data ; algorithm ; privateRSAKey ; keyPassword )

data – This parameter is the message that is to be signed. The value can come from a field or a variable and can be text or binary (e.g. a container field).
algorithm – This parameter specifies the cryptographic hash algorithm used to create the digest. Note that we get to choose the hash algorithm (used in step 1 of the signing process), but we don't get to choose the encryption algorithm (used in step 2); that choice is made for us. FileMaker has settled on an RSA-based signature scheme, but there are others.

privateRSAKey – This parameter is the private RSA key that will be used to encrypt the message digest. As a further security precaution, this private key can itself be encrypted with a password. This encryption is done using symmetric encryption (the same password is used to encrypt and decrypt the key) and creates an additional hurdle for a would-be attacker who may have obtained a copy of the private key but does not know the encryption password.

keyPassword – If the private RSA key is encrypted, the encryption password is provided in this parameter.

Our choices for the cryptographic hash algorithm are the same as with the CryptAuthCode function:

MD5 — do not use
MDC2 — do not use
SHA
SHA1 — do not use
SHA224
SHA256
SHA384
SHA512

Do not use MD5 or SHA-1, as they have been shown to have security vulnerabilities. MDC-2 may also be unsafe to use, based on the information provided in this article.

The CryptGenerateSignature function returns the digital signature as a binary file which can be stored in a container field. As a subsequent step, the digital signature can also be stored as text using one of the text encoding functions, e.g. Base64Encode. If any of the parameters are invalid, the function returns "?".

Note that CryptGenerateSignature does not embed a timestamp inside the signature. This would have been nice to have, so that the date and time of signing could be looked up as part of verifying the signature. One way of adding this yourself is to include the date and time inside the document that is being signed.
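To make the hash-then-encrypt mechanics concrete, here is a deliberately tiny Python sketch of the scheme described under "How Digital Signatures Work." The textbook RSA parameters and the sample message are invented for illustration; real signatures use keys of 2048 bits or more plus a padding scheme, so treat this as a conceptual model, not an implementation.

```python
import hashlib

# Classic textbook RSA parameters (far too small for real-world use).
p, q = 61, 53
n = p * q      # public modulus (3233)
e = 17         # public exponent
d = 2753       # private exponent: e * d ≡ 1 (mod (p-1)*(q-1))

def digest_int(message: bytes) -> int:
    # Step 1: hash the message, then reduce it so it fits under the tiny modulus.
    return int(hashlib.sha256(message).hexdigest(), 16) % n

def sign(message: bytes) -> int:
    # Step 2: "encrypt" the digest with the private exponent.
    return pow(digest_int(message), d, n)

def verify(message: bytes, signature: int) -> bool:
    # Decrypt the signature with the public exponent and compare digests.
    return pow(signature, e, n) == digest_int(message)

contract = b"Service ends 2012-12-31."
sig = sign(contract)
assert verify(contract, sig)                 # intact message and signature: verifies
assert not verify(contract, (sig + 1) % n)   # a corrupted signature fails
```

Changing either the message or the signature breaks the equality check in verify, which is exactly how tampering with a digitally signed contract invalidates the signature.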
Verifying a Digital Signature Using FileMaker

A digital signature can be verified using this function:

CryptVerifySignature ( data ; algorithm ; publicRSAKey ; signature )

data – This parameter is the message that was signed and whose signature needs to be verified.

algorithm – This parameter specifies which cryptographic hash algorithm was used to create the signature. (I am not sure why this parameter is needed, since the algorithm is also specified inside the signature.)

publicRSAKey – This parameter is the public RSA key that corresponds to the private key used in step 2 of the signature generation.

signature – This parameter is the binary digital signature file; it is the signature that we want to verify.

The function returns 1 if the signature was verified and 0 if it was not. If any of the parameters are invalid, the function returns "?".

Authenticity?

The CryptVerifySignature function provides data integrity and non-repudiation, but not authenticity. This is because a public key on its own does not include any identifying information about the key owner. To provide that, the public key would have to be included in a certificate signed by a Certificate Authority. These certificates are similar to the SSL certificates you may be familiar with. Signing certificates are known as "email and document signing certificates," and the "common name" in those certificates specifies an email address instead of a domain name. If you are interested in getting one for yourself, you can purchase them at a discount at sites like this one: https://www.gogetssl.com/email-document-signing-certificates/.

If you do end up getting a signing certificate, you can extract the public key from it using this OpenSSL command:

openssl x509 -pubkey -noout -in <certificate_file>

Note: OpenSSL is an open-source cryptography software library. It comes preinstalled on macOS and can be easily installed on Windows.
Here's a description of the command and its options:

x509 — This command instructs OpenSSL to operate on a certificate file.
-pubkey — This option specifies that the public key should be displayed.
-noout — This option specifies that the encoded contents of the certificate itself should not be displayed.
-in — This option specifies the name of the certificate file. (Replace <certificate_file> with your certificate's file name.)

You can see more details about this OpenSSL command here.

Demo File

If you would like to experiment with the new FileMaker functions, take a look at the demo file. I have provided a sample public/private key pair so that you can get started right away.

Get the demo file

Collisions with MD5

I mentioned earlier that MD5, MDC-2, and SHA-1 should be avoided. These hash algorithms have known security vulnerabilities and should not be used to generate digital signatures. With MD5 in particular, it is possible to generate versions of the original and tampered documents that share the same MD5 digest in near-instant time. To demonstrate this, my colleague Brian Engert did just that with the two images of the contract shown above. If you download those two image files and generate an MD5 hash for each one, you'll notice that they are identical.

Figure 3 – MD5 for Figures 1 and 2

You can read more about these vulnerabilities here:

MD5: https://github.com/corkami/collisions
SHA-1: https://shattered.io/
MDC-2: https://www.semanticscholar.org/paper/Cryptanalysis-of-MDC-2-Knudsen-Mendel/adab90fc06c2c14e91b9f7288929cdd2d5b16a52

How to Generate a Public/Private Key Pair

If you want to generate a public/private RSA key pair to use for testing the CryptGenerateSignature and CryptVerifySignature functions, you can do so quickly and easily using the OpenSSL commands shown below.
Just don't forget that a public key by itself does not include any identifying information about the key owner. That means that, while you can achieve data integrity and non-repudiation by using the public key to verify a digital signature, you will not be able to confirm the authenticity of the signature.

Here are the commands to use:

Step 1. Generate an RSA private key:

openssl genrsa -aes256 -out private.pem 4096

genrsa — This command instructs OpenSSL to generate an RSA key.
-aes256 — This option specifies that the AES-256 cipher should be used to encrypt the private key.
-out — This option specifies the name of the private key file that is to be generated.
4096 — This is the size of the private key. In this case, we are asking for a 4096-bit key.

You will be prompted to specify an encryption password for the private key when you run this command.

Step 2. Export the RSA public key from the private key:

openssl rsa -in private.pem -outform PEM -pubout -out public.pem

rsa — This command instructs OpenSSL to process an RSA key.
-in — This option specifies the name of the private key file that is to be processed.
-outform — This option specifies the format of the public key file that is to be exported.
-pubout — This option instructs the rsa command to output the public key.
-out — This option specifies the name of the public key file that is to be exported.

You can see more details about these OpenSSL commands here:

Generate RSA private key: https://www.openssl.org/docs/manmaster/man1/genrsa.html
Export public key: https://www.openssl.org/docs/manmaster/man1/rsa.html

Summary

The CryptGenerateSignature and CryptVerifySignature functions make it possible to create and verify RSA digital signatures for files or text content. Consider using digital signatures if you have a use case where the transmission of data has to incorporate the data integrity and non-repudiation security principles.
Verifying authenticity may be possible as well, although it would require some additional setup. Special thanks to Brian Engert for lending me his expertise in preparing this blog post.

The post FileMaker 18: Digital Signatures Using CryptGenerateSignature and CryptVerifySignature appeared first on Soliant. See the original post.
  2. Hello everyone! I have created a custom function for dynamically creating JSON arrays for related records in FileMaker, and I'd like to share it with you all. The custom function can be found in the sample file. Its full signature looks like this:

#JSON_GetRelatedData ( objectName ; attributeList ; relatedTO ; relatedRecordCount ; JSON ; iterationRecord ; iterationField )

Figure 1. Custom function for creating dynamic JSON

It takes several parameters, as described below:

objectName – What you want the parent object to be called; in my sample file I call it "RelatedModels."

attributeList – The field names you want to include in the JSON. This list needs to be return-delimited. In my example, I am using "Name¶BodyStyle¶Year¶StartingPrice." You need at least one value in this list, and each value must have a corresponding field in the related table.

relatedTO – The Table Occurrence name; in this example it's simply "Model." If you're using anchor-buoy naming, it would look something like "MNF_MOD__Model." A good hint is to use what you see in the bottom left corner of the portal when you're in layout mode.

relatedRecordCount – This custom function is recursive, so it needs to know when to stop. When you pass in the number of related records, the custom function will only go looking for that number of records. In my example, I am using Count ( Model::ID ) to determine how many models exist for the current manufacturer.

JSON – Leave this blank (empty quotes); it's used in the recursive iterations.

iterationRecord – Leave this blank (empty quotes); it's used in the recursive iterations.

iterationField – Leave this blank (empty quotes); it's used in the recursive iterations.

My sample file has two buttons that use this custom function: "Manufacturer and Model JSON" and "Model JSON." The Manufacturer and Model JSON button demonstrates how you would use this custom function in combination with other JSON data.
The Model JSON button demonstrates how to use the custom function by itself. An example input is shown in Figure 2:

Figure 2. Example input

Note: This example wraps the call in the JSONFormatElements() function, which is not required but makes the results more human-readable.

The example results are shown in Figure 3:

Figure 3. Results from the example input

Get the Sample File

Custom Function to Create Dynamic JSON Demo

I hope that someone finds this useful!

The post Create Dynamic JSON for Related Records Using This Custom Function appeared first on Soliant Consulting. See the original post.
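As an aside on the technique above: the record-by-record recursion the custom function performs can be sketched in Python. The sample data, field values, and function name below are invented for illustration; they are not taken from the demo file.

```python
import json

# Illustrative stand-in for the related table's records (e.g., the Model table).
related_records = [
    {"Name": "Roadster", "BodyStyle": "Coupe", "Year": 2008, "StartingPrice": 98950},
    {"Name": "Sedan X",  "BodyStyle": "Sedan", "Year": 2012, "StartingPrice": 79990},
]

def get_related_data(object_name, attribute_list, records,
                     record_index=0, result=None):
    """Recursively build {object_name: [ {field: value, ... }, ... ]},
    one record per iteration, mirroring the custom function's recursion."""
    if result is None:
        result = []
    if record_index >= len(records):   # stopping condition (relatedRecordCount)
        return {object_name: result}
    row = {field: records[record_index][field] for field in attribute_list}
    result.append(row)
    return get_related_data(object_name, attribute_list, records,
                            record_index + 1, result)

payload = get_related_data("RelatedModels",
                           ["Name", "BodyStyle", "Year", "StartingPrice"],
                           related_records)
print(json.dumps(payload, indent=2))  # pretty-printed, like JSONFormatElements()
```

The blank JSON / iterationRecord parameters in the FileMaker version play the same role as the result and record_index accumulators here: they carry state between recursive calls and should start out empty.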
  3. Dynamically Create JSON Arrays for Related Records

Download the custom function for dynamically creating JSON arrays for related records in FileMaker, built by Makah Encarnacao, Soliant Consulting Technical Project Lead. The file includes two buttons that use this custom function: "Manufacturer and Model JSON" and "Model JSON." The Manufacturer and Model JSON button demonstrates how you would use this custom function in combination with other JSON data. The Model JSON button demonstrates how to use the custom function by itself. Read the accompanying blog post.

Complete the form to receive the demo file.

The post Create Dynamic JSON for Related Records Demo appeared first on Soliant Consulting. See the original post.
  4. Every summer, our team enjoys traveling to DevCon, FileMaker's developer conference. As a FileMaker Platinum Partner, we attend the conference to build on our skills, meet with clients, and share our knowledge. In fact, a handful of team members are presenting and sharing their insights with attendees this year:

Advanced Training
Bob Bowers, President

Learn advanced techniques and strategies for creating high-performing, scalable FileMaker custom apps. Explore tools and techniques to make you a more efficient and confident developer. This session includes a combination of demonstrations and hands-on exercises. Topics include: Perform Script on Server, ExecuteSQL, virtual lists, Insert From URL, custom functions, Script Triggers, JSON, advanced scripting, calculations, and layout techniques. Demo files and resources will be available so you can continue to practice and learn on your own after class.

Detective Boot Camp: Debugging Best Practices
Makah Encarnacao, Technical Project Lead

"Debugging is like being a detective in a crime movie... where you are also the murderer." – Filipe Fortes

In this session, we will explore how to set up your scripts for debugging success. We will take a deep dive into how to debug using FileMaker's Script Debugger and Data Viewer. Finally, we will explore some common "crimes" and how to expose the culprit in each scenario. These examples will explore different concepts to enhance your debugging arsenal. After attending this session, you will have sharpened your detective skills and, hopefully, dwindled your destructive tendencies.

Big or Small: How to Contribute to the Community
Makah Encarnacao, Technical Project Lead

You're great at your job. You work hard; you solve problems day in and day out. You've learned a thing or two over the years, and you'd like to share your talents beyond your job, to contribute to the community. Perhaps you'd like to share your pearls of technical wisdom.
Perhaps you'd like to be a great mentor, like the mentor you had early in your career. Perhaps you'd like to help with a social cause you believe in. In this session, we'll talk about the different ways you can add to the FileMaker Community, and why you might find that you get more than you give.

From Clutter to Clarity: Redesigning Layouts for Improved User Experience
Mark Baum, Senior Application Developer

Are your interfaces easy to interpret and intuitive to use, or are they so complex that they require explanation and special training? In this session, I'll take a series of cluttered real-world legacy layouts and show in time-lapse fashion how I would redesign them for an improved user experience. Along the way, I'll discuss design principles and patterns. I'll also demonstrate FileMaker-specific skills such as working with styles and themes, leveraging specific object properties to indicate interactivity, and customizing icons. I'll also show the same layouts as re-envisioned by several other designers within the FileMaker Community.

Server Monitoring with Zabbix
Mislav Kos, Senior Technical Project Lead & Wim Decorte, Senior Technical Solution Architect

The removal of the Statistics chart from the admin console has left a lot of developers scrambling to come up with alternate ways of monitoring the health of their deployments. The free Zabbix software provides a way to do this and much more. You can track a variety of server metrics for one or multiple servers and configure alerts or other actions to be triggered by certain events, e.g. if free disk space falls below 10%. Come to this session to find out what Zabbix can do and how to set it up with your FileMaker solution deployment.

Data Modeling - Hands-on Lab
Martha Zink, Senior Technical Project Lead

The choices you make when organizing your information determine what your app can show to the user and what your app can do.
This lab will give you information and experience in creating a strong foundation for your app. You'll start outside of FileMaker: learning key concepts about data modeling, planning out an app by identifying entities, and creating a diagram of the data model. Once you've done the planning, you'll learn to translate that plan into FileMaker Pro Advanced, giving you a strong architectural foundation for your app. This lab will give you a chance to tackle data modeling hands-on, so bring your laptop, your brain, and your questions.

Come Visit Our Booth

We're sponsoring the conference again this year, so make sure to check out our booth too. You can meet our top developers and consultants. We're looking forward to the event! Will we see you there?

The post Join Us at DevCon 2019 appeared first on Soliant Consulting. See the original post.
  5. Web viewers first appeared in FileMaker 8 to allow users to display various forms of content. It didn't take long for developers to figure out how to leverage them to display HTML and JavaScript, which greatly expanded their power. Now there are thousands of open-source JavaScript libraries allowing users to visually represent data in seemingly endless ways.

The Problems with JavaScript in FileMaker

Unfortunately, there is a downside to this method of powerful, free data display: implementing JavaScript libraries can be very difficult. It's not impossible, but it takes a JavaScript expert to gather and assemble the proper resources and get everything to the fully working state that you desire. Even though the concept is the same for each library (find and load the necessary resources, then load your data), they don't all work the same. Besides the JavaScript library you are trying to work with, you probably need additional JavaScript libraries to get your web viewer functioning properly. There's also a good chance you need CSS resources. This leaves you with the decision of whether to load these libraries and CSS locally or to reference them.

The difficulties don't end there. Let's say you've got your resources and data loaded, one way or another, and you go and test this in your application. You find your web viewer is a sad, empty box. How do you debug this? Making changes and re-testing relies heavily on the use of your clipboard and prayer. And if you do get it working and then want to make changes to your working web viewer, how do you handle that? More clipboard and more prayer. One wrong paste and you've lost the working version. You can use a text editor to store your "versions," but that's just clunky and confusing. Not that testing any of this is a piece of cake either. It's difficult to work directly with the data in the web viewer.
And again, you need to try something like saving the content to a text editor as HTML and then loading that in a browser. This leaves you a few steps removed from FileMaker and your web viewer, where you actually want the JavaScript working.

Carafe: A Free, Open-source Solution

That's why Soliant created Carafe, an open-source package manager for implementing JavaScript libraries in FileMaker. It comes with a number of pre-written bundles like Google Maps and DataTables. It's simple to add these features to your application:

1. Click the Deploy button on the bundle detail.
2. Go to your solution and paste the script that is now on your clipboard into your Script Workspace.
3. Add a web viewer to your layout.
4. In the Configure portion of your imported script, set the $webviewerName variable to the object name of your newly created web viewer.
5. Call the script with your data as JSON in the script parameter, as described in the comments.

Voila! You now have a web viewer displaying your data. This is where the fun really starts.

Versioning Capabilities

Carafe allows versioning, which lets users try out changes without losing the critical working version. When you want to make changes to your bundle, use the Edit Bundle button, which opens your bundle in CodeSandbox. In CodeSandbox you can do all the testing you want by playing around with the JSON files and seeing the results. When you are satisfied with the changes, the Send to Carafe button will export your changed bundle back into Carafe with a new version number.

Community Collaboration with Carafe

You are not limited to the JavaScript libraries currently included in Carafe, either. You can add any JavaScript library. You can even share bundles with other developers in the community using a simple export/import of bundles from one developer's Carafe to another. Simply clicking the Share button on your Carafe bundle creates a JSON export file, which another user can bring into their Carafe file by using the Import button.
Getting Started in Carafe

I hope you take the time to download Carafe, try it out, and join our community of developers building and sharing bundles. It's free, easy, and pretty cool. Start using Carafe today.

The post Intro to Carafe: The Open-Source FileMaker-JavaScript Solution appeared first on Soliant Consulting. See the original post.
  6. Adding JavaScript to your FileMaker solution allows you to customize your functionality and do more on the platform. The opportunities are endless. While these capabilities have only recently started to become mainstream, it's evident FileMaker, Inc., has aligned behind the programming language. The company has launched native JSON functions, two robust JSON APIs, a Node.js server that ships with the product, and a pure JavaScript Admin Console. As a result, interest in JavaScript within the FileMaker developer community is growing.

Unfortunately, working with JavaScript in FileMaker isn't exactly straightforward. It takes a lot of time and effort to implement and maintain the required specialized techniques. This deters many developers on the platform. After implementing JavaScript for our FileMaker clients, we quickly realized we had to find a better method. That's why we built Carafe, a free, open-source project that streamlines the process of integrating and working with JavaScript in FileMaker.

Announcing Carafe

Carafe provides built-in package management, versioning of your implementation, simplified debugging, and round-trip integration with JavaScript code editors. You can easily integrate and configure JavaScript libraries such as DataTables.js, Google Maps, rich text editors, an image gallery, a calendar, charting, and much more.

A Free, Open-source Solution

We want to share Carafe to help other developers customize their FileMaker implementations with JavaScript, so we've decided to make the solution publicly available. It's open-source, so you can download the files and make it your own. In fact, we encourage you to add to it and share your upgrades with others.

Launching Soon

Carafe is coming soon! Get the files first by signing up for alerts here.

The post Carafe: The Fast & Free Tool to Implementing JavaScript In FileMaker appeared first on Soliant Consulting. See the original post.
  7. Not ( Having a Computer Science Background ) The FileMaker developer community seems to consist of people with quite a wide range of backgrounds, much more so than is the case with developers working in other programming languages and on other software development platforms. I suspect this is a consequence of the rapid application development (RAD) nature of the platform and its lower learning curve. This makes it possible for individuals to change course in their careers once they serendipitously encounter FileMaker and see how easy and fun software development can be. This diversity of backgrounds is a strength of the FileMaker platform, but one consequence is that many developers come to FileMaker without much formal computer science training. Boolean Algebra This blog post is intended to provide one small, tiny lesson that I recall from one of my computer science classes (taken long ago). The "lesson" has to do with Boolean algebra, which describes the rules for how Boolean operators (AND, OR, NOT) are evaluated. One of my colleagues, someone who took up FileMaker development just a few months ago, recently asked me the following question: Can a long chain of Boolean operations like the one below be simplified somehow by extracting the negations (the "not's") out of the expression: not A and not B and not C and not D De Morgan's Laws To find out the answer, we have to learn about De Morgan's laws. There are two of them: not (A or B) = not A and not B not (A and B) = not A or not B With this knowledge under our belt, we can now rewrite the original expression as: not ( A or B or C or D ) The "Hide object when" calculation There are occasions when it's easier to think about when something should be false instead of when it should be true. An example of this is the "Hide object when" FileMaker calculation. For me, it's much easier to think about when an object should be displayed instead of when it should be hidden.
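De Morgan's laws hold for every combination of truth values, which is easy to confirm by brute force. The quick check below is plain Python rather than FileMaker calculation syntax, but the logic carries over directly:

```python
from itertools import product

# Exhaustively check both De Morgan laws, plus the four-term
# rewrite from the question, over all 16 truth assignments.
results = []
for a, b, c, d in product([False, True], repeat=4):
    law1 = (not (a or b)) == ((not a) and (not b))
    law2 = (not (a and b)) == ((not a) or (not b))
    rewrite = ((not a) and (not b) and (not c) and (not d)) == (not (a or b or c or d))
    results.append(law1 and law2 and rewrite)

print(all(results))  # True
```

Because the check is exhaustive over all assignments, it is a genuine proof of the equivalence, not just a spot check.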
De Morgan's laws are very useful in these situations. For instance, suppose we have the following expression for when an object should be shown: ( not A or not B ) and ( not C or not D ) // display object I can negate this so that it can be used in the "Hide object when" calculation. not ( ( not A or not B ) and ( not C or not D ) ) // hide object = not ( not A or not B ) or not ( not C or not D ) = ( A and B ) or ( C and D ) Of course, we could have just kept it as this: not ( ( not A or not B ) and ( not C or not D ) ) // hide object I tend to go with whichever one I think will be easier to understand later on, in case I (or some other developer) have to return to this calculation down the road. References Boolean Algebra De Morgan's Laws If you have any questions or need help with your FileMaker solution, please contact our team. The post Boolean Algebra and De Morgan’s Laws appeared first on Soliant Consulting. View the original post
  8. FileMaker Templates are out! Our team here at Soliant was lucky enough to help build the files and tutorials, and I wanted to share our hopes for how you can make use of them. Three templates are available so far, and each one targets a different business model: Job Tracking, Event Management, and Memberships. Each template comprises a few parts: A core file Tutorials for how to add Build-On features A final file that includes all Build-Ons The idea is that you begin with a core file and take it in the direction that suits your needs. This is a new effort that has a lot of different uses, depending on who you are. It’s like a Choose Your Own Adventure book, but with fewer deadly caves. For the Beginning Developer Templates are absolutely for you! You can poke around the core files to see what you can pick up about how they were developed. At the right of each layout, in Layout mode, you’ll see notes about the layout if we did something interesting or less-than-obvious. When you’re ready to add on to the core files, choose one of the Build-Ons your business needs. Then follow the tutorial for that Build-On, step by step. The tutorials look long, but don’t let that slow you down! They were written with enough detail for a beginner to be able to follow the steps and construct complete features in a FileMaker app. Begin by scanning the list of lessons in the tutorial, seen in the left-hand menu below. Some Build-Ons can be completed in one lesson, while others build up more complex functionality lesson by lesson. We broke out each piece of user functionality (AKA user story) into its own lesson, so that with each one you end up with something useful. For example, in the iPad Attachments Build-On for the Job Tracking Template, the complete work takes four lessons. First you give users iPad layouts, then let them add attachments, then view other attachments, and finally loop it all back to your FMPA users. Figure 1.
iPad Attachments Lessons The tutorial Overview (always the first lesson in a tutorial) tells you what the Build-On will do, so you’ll know whether it does what you need, and can decide whether to build it. Figure 2. iPad Attachments Lessons Overview Once you’re in a lesson, take a moment to scan the Lesson Overview. This is a table of contents listing each major step in the lesson, and it’ll give you the big picture of what you’ll be doing. (“First I’ll add a table and some fields, then I’ll work on a layout, then I’ll write a script…”). Figure 3. Lesson Steps When you’re ready, walk through each step closely. Before you know it you’ll be at the Review your work section watching your skills in action. For the Intermediate Developer The Templates offerings should be right in your wheelhouse. Consider the core file a solid jumping-off point: each core file was developed to the extent that it is usable as-is for basic functionality. However, it leaves room for additional work so you can customize it yourself. (The core files are more robust than Starter Apps, but less complex than Sample Apps.) Think of your own needs and get creative with the modules you add on! We wrote and commented the scripts so they’d be transparent enough for an intermediate developer to expand on without a ton of investigation. Have at it! If you want to add functionality described in one of the Build-On tutorials, you could follow that tutorial explicitly. Or, you could just scan it and follow the broad strokes. The tutorials are quite granular. However, we designed them so that if you don’t need that level of detail, you can scan them quickly to see the overarching steps within each one. Check out the Lesson Overview, scan through the screenshots for a quick view of the development, and then use your skills to quickly build the new functionality.
If your build doesn’t work the way we describe the end result in the Review your work section, you can always go back through the instructions to see what we did differently. For the Advanced Developer Check out the files to see the development standards we used and consider adopting them for your own solutions. These standards include passing parameters using custom functions, semi-modular scripting, and naming conventions. The theme itself went through a few rounds of vetting; you may find that useful for your own work. For more on the design choices the team made, check out Alycia McGuire's post. Whatever your skill level: you’ve got this! If you have any questions, please feel free to comment below or contact our team directly. We’re happy to help! The post FileMaker Templates: Are They Right for You? appeared first on Soliant Consulting. View the original post
  9. Our team worked with FileMaker to build several template files to help businesses launch specific functionality. These files encourage workplace innovation without requiring extensive development experience. File Foundation Built for You At the core of these templates, you’ll find a solid, consistent design, a clean look, and a strategic user interface and user experience. Look under the hood and you’ll find much of the same. That wasn’t by accident. Our team spent a great deal of time determining how these apps should look, how they should flow, and most importantly, how to develop them. Unlike starter solutions, these files are meant to be dissected, understood, learned from, and actually built upon. I’d like to go into some detail about some of the best practices we hit on as we worked through our design and development process for them. User Interface & User Experience Best Practices User experience describes how the user perceives certain features of an app. Consider usability, efficiency, and the overall flow of the app. User interface describes the overall look and style of the app. Our goal was to build these solutions to feel and operate much like the apps we enjoy every day. Strategic Navigation For example, see how navigation between different layouts moves throughout the app. Rather than have a navigation bar on each layout, we took advantage of one of FileMaker’s newer features, card windows. We created a navigation layout that displays as a card window. Card windows, introduced in FileMaker 16, allow you to layer windows within the same window, a great feature to take advantage of when designing file navigation. It gives the user more space on each layout and eliminates layout clutter. It also allows for easier manipulation of the navigation menu. You only have to make changes to a single layout versus changing every layout that includes the navigation menu.
We used a small button with the hamburger icon in the upper right corner of each layout with a script attached that opens the navigation layout in a card window. Because you have control over the size and position of card windows, you have the power to display the navigation layout anywhere within the window. We chose a slender window that appears on the far left of the screen. Using the Get ( WindowHeight ) function, we are able to keep the navigation window the same height as the active window. By taking Get ( WindowWidth ) and subtracting the navigation menu’s layout width (in this case 320pt), the navigation window will always open to the leftmost side of the window. The finished product is a clean, functional navigation window that only displays when the user needs it, giving the main layouts a sleeker look and feel. Color Consistency Another important aspect of user experience we focused on when designing these custom solutions is the use of consistent colors throughout the app. We aimed for the workflow process to come as naturally as possible to the user, without confusion or hesitation. One way of doing this is to guide the user through the workflow with specific colors. This means selecting colors for each different type of call-to-action, whether a button or clickable text. We kept the colors simple for these files and selected a single color for “positive” or next action steps, and muted colors for “negative” or canceling action steps. For example, a user creating a new time entry in the Job Tracking file has two options once they finish entering time: “Close” and “Delete”. The natural action, “Close”, displays a brighter color for that button (in this case, blue). The user may still need the option of “Delete”, but it shouldn’t be called out to their immediate attention. Therefore we used a more muted color (in this case, gray).
This part of the user experience carries throughout each of the core files and their additional Build-Ons. For any action we wanted to guide the user through or call attention to, we used blue for buttons and hover states. For any buttons or text we wanted to keep as an option but steer the user away from as the unnatural next action, we used a shade of gray. Design Themes Setting a design theme at the beginning of the development process allows you to easily apply a consistent design throughout the entire solution. You can use a custom theme of your own design, or a built-in FileMaker theme. It should include consistent colors, fonts, and styles set for each type of object. We recommend choosing a single theme at the beginning of your design process and avoiding changes afterward. This eliminates design inconsistency during development. For our team, this process took a few iterations, but we landed on a custom theme, with styles for all layout parts, shapes, buttons, portals, edit boxes, and more. Although this required a bit more work in the beginning, it saved time and stress as we worked through designing each layout. Every added object had a style, so we didn’t need to think about how to style each object or worry about ensuring we kept things consistent layout to layout. Scripts and Coding Consistency We also recommend setting scripting and coding standards at the beginning of a project. Define and name scripts, fields, tables, variables, etc. We spent significant time on this for the template files as well. Our team defined a set of standards and naming conventions before development started. Consistency tremendously helps testing, QA, and future rework needs. Similarly, you should strive for consistency in the organization of scripts and in the scripting itself. Folders organize your scripts and make them easy to find.
Each script within the core files has a header section providing details about the script: its purpose, the context, parameters passed to the script, what the script returns, and any notes that might be useful for future users. This helps keep scripts clean and easy to read. It also gives anyone who wants to modify the script the information they need to understand it and its purpose. Commenting throughout the script also helps others understand the logic behind each step or set of steps. When developing larger solutions or working with multiple developers, it’s easy to forget the thought process put into the script at the time. Having comments eliminates that “What was I doing here, again?” moment. The result is an end product that not only looks and operates professionally but is clean and well-laid-out under the hood as well. Global Variables We also made consistent use of FileMaker’s native global variables for manipulating layouts throughout our design. As I mentioned before, we used card windows throughout the files, not only for navigation but also as a way to quickly input or manipulate data for different records. For instance, in the Job Tracking template, we used a card window to input new and existing time entries. Instead of creating two near-identical layouts with only slight differences, for example, in the header name, we used global variables set in the calling script that are then used on the layouts as merge variables to display the appropriate header text for that action. The header for the two actions displays as “Edit Time Entry” and “New Time Entry”, which in Layout mode shows as “<<$$globalField>> Time Entry”. When the user triggers the script to add a new time entry, the global variable $$globalField is set to the word “New”. The user then sees the card window displaying the layout with the header “New Time Entry”. This allows the use of a single layout for multiple actions.
The same concept is used for each call-to-action button on these layouts. Buttons used to create new records will have different actions than buttons used to edit or delete records. Following this same Job Tracking example, the case would be editing existing time entries versus creating a new time entry. Each of these different actions requires different buttons. By using global variables, hiding conditions, and layering the buttons on the layout, you can continue to use the same layout for multiple actions. This keeps the design process simple and the number of layouts to a minimum. Master-detail Layouts The last native FileMaker feature great for enhancing both user interface and user experience that I’d like to share is the use of Master-detail layouts. This is new to FileMaker 17. You can now create portals that work with the found set of the current table. This saves time, as you no longer need to create self-join relationships and additional scripting to accomplish this. The resulting layout allows users to click through a list of records and see a record’s details all in one layout. This works great for displaying things like staff or client details. Using Master-detail layouts became standard for the core files. It takes them to the next level, both visually and functionally. Ensuring a Strategic User Experience in All FileMaker Solutions Taking time in the beginning to focus on consistent design and organization pays off in the end. It reduces your development time and ensures a professional-looking final product. Our team follows these best practices in all of our development. If you would like our insights in taking your FileMaker solution to the next level, contact our team to learn more. The post FileMaker Templates: Give Your Users a Better Experience appeared first on Soliant Consulting. View the original post
  10. Do you miss the statistics chart? Figure 1. FileMaker 16 Admin Console with Statistics live view The admin console was rewritten in FileMaker Server 17, and the Statistics section was not carried over to the new version. But when you are dealing with server performance issues, having a live view of the FileMaker Server statistics can be critical. If your FileMaker Server runs on Windows, you can regain most of this lost functionality using the Performance Monitor tool ("Perfmon"). Perfmon Set Up Step 1. Add Performance Monitor snap-in to the Microsoft Management Console. Launch the Microsoft Management Console (mmc.exe). One way to do this is to click the Windows Start menu and type "mmc.exe". Go to the File menu, and select Add/Remove Snap-ins. Select "Performance Monitor", select "Add >", select OK, and select OK again. Step 2. Set up Performance Monitor with your desired counters. In the left pane of the Console window, go to Performance (Local) > Monitoring Tools > Performance Monitor. Figure 2. Left pane in the Microsoft Management Console Right-click in the blank space of the pane on the right side and select “Add Counters”. Figure 3. Select “Add counters” in the popup menu Add your desired counters. The Performance Monitor tool gives you access to all of the stats that were available in the Statistics section of the FileMaker Server 16 admin console and to a whole new set of Windows OS stats (called "counters"). If you're not sure which ones to add, try these: FileMaker Server 16 > FileMaker Clients FileMaker Server 16 > Remote Calls In Progress FileMaker Server 16 > Remote Calls/sec FileMaker Server 16 > Elapsed Time/call FileMaker Server 16 > Wait Time/call Processor > % Processor Time > _Total Process > % Processor Time > fmserver — The fmserver process is the main one, but also pay attention to the other ones starting with "fm", especially fmsase (server script engine). 
To see which counters FileMaker recommends tracking, see the "FileMaker Server for Windows Performance Monitoring" article. As an example of how to add a counter, to add the "FileMaker Server 17 > Elapsed Time/call" counter, find "FileMaker Server 17" in the list (the counter group is named for the FileMaker Server version installed on the machine, so it may read "FileMaker Server 16" or "FileMaker Server 17"), click the down arrow to show the counters for FileMaker Server 17, select "Elapsed Time/call", and click Add. Figure 4. Adding elapsed time per call Fine-tune how the counters are displayed. Make sure that the “highlight” option is enabled. This will make it easier to see the currently-selected counter. Figure 5. Ensure “Highlight Option” is enabled Set up the counters so that the ones you check most frequently are shown on the chart and the others are hidden (unchecked in the "Show" column). Pay attention to the scale. When the counters were added, the tool guessed at what the scale should be for each counter, but you may prefer a different scale setting. Here's how the scale setting works: if the % Processor Time chart shows that the value is mostly between 10 and 30, but the scale is set to 10, then the value is actually between 1 and 3. To change the scale, double-click on the counter or right-click and select Properties (or press Ctrl-Q). Here's how I set my counters up the last time I did this: Figure 6. Counter setup Step 3. Save Console configuration Go to the File menu and select "Save As..." Save to a folder that is accessible to everyone who will need access to this. For instance, I saved mine here: C:\PerfLogs\FMS 17 Perfmon Live View.msc Close the Microsoft Management Console. Accessing the "Live" View Double-click on the saved file to open the Perfmon console. It will retain your last settings, including selected counters and their scale settings. Figure 7. Perfmon console in FileMaker 17 Perfmon Constraints, Limitations, and Other Notes Last minute's data only This "live view" will only show the last minute's worth of data. To see the previous history, you can set up Perfmon to save the data.
See the "FileMaker Server for Windows Performance Monitoring" article for instructions on how to do this. Admin rights required You will need to log into the server with an admin account to access Perfmon. Concurrent views Because accessing Perfmon requires you to connect to the server via RDP, having multiple people viewing Perfmon at the same time will require the server to maintain multiple Windows RDP sessions. It's best to avoid this, especially during times when the server is experiencing performance issues, so coordinate amongst your team to pick just one person who will have the Perfmon view open during times like that. Interpreting "% Processor Time" The percent value has a different meaning depending on which counter (Processor or Process) you are looking at: For the "Processor" counter: 100% means that all cores are completely utilized. If your server has 32 logical processors, then 3.1% (1 divided by 32) would be the equivalent of one fully-utilized processor. For the "Process" counter: 100% means that the process being monitored (for example, the "fmserver" process) is fully utilizing the equivalent of one logical processor (i.e., one core). 200% means that the process is utilizing the equivalent of two logical processors, etc. If you have any questions or need help with your FileMaker solution, please contact our team. The post Using Perfmon to View Live FileMaker Server Stats appeared first on Soliant Consulting. View the original post
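A footnote on the "% Processor Time" arithmetic above: it is simple but easy to get backwards, so here is a small sketch (plain Python; the helper names are mine, not part of Perfmon) that converts each counter's percentage into a "number of fully utilized cores":

```python
# The "Processor" counter reports a percentage of ALL logical
# processors combined, while the "Process" counter reports a
# percentage of ONE logical processor.

def process_pct_to_cores(process_pct: float) -> float:
    """Process counter: 100% == one fully utilized core."""
    return process_pct / 100.0

def processor_pct_to_cores(processor_pct: float, logical_processors: int) -> float:
    """Processor counter: 100% == every core fully utilized."""
    return processor_pct / 100.0 * logical_processors

# On a 32-core server, 3.125% on the Processor counter is one full core...
print(processor_pct_to_cores(3.125, 32))  # 1.0
# ...and 200% on the fmserver Process counter is two full cores.
print(process_pct_to_cores(200))          # 2.0
```

(The article rounds 1/32 to 3.1%; the exact value is 3.125%.)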
  11. FileMaker introduced a very different kind of capability with the ExecuteSQL function. First introduced in FileMaker 12, it allows you to perform “select” queries against your FileMaker data, as opposed to the more familiar “Find” functionality available in FileMaker. SQL (Structured Query Language) is a standard used by other, more traditional, database servers. Using SQL, SELECTs on indexed data are blazing fast and efficient, and you can construct relationships in SQL that do not necessarily exist in your application otherwise. Many useful techniques have been demonstrated since the function's introduction. This post will detail and explain yet another ExecuteSQL technique: Search as You Type. The Search as You Type Technique In this example, we will attach a script trigger to a global field that will fire on every keystroke. The script will then perform an ExecuteSQL call to build a list of IDs that relate to the table we want to show results from. Once we have a list of IDs, we can enter those in a global field, where they will act as a multi-key relationship to show related records in a portal. Sound easy? The real trick is dynamically building the SQL we want to use. However, as this is all handled by our script, you do not need to know SQL to use this solution. By getting a dynamic list of fields to search on, the search term entered can be matched against as many fields as we want. Finally, if we allow for entering multiple search terms, separated by commas, we can build a query that looks across ALL fields for multiple terms and narrows results as you type. An example screenshot is shown below. Figure 1 - Enter multiple search terms separated by commas Step One: Building the Field List To start, we need a list of fields to search on. Fortunately, you can interrogate the internal FileMaker tables used to reference schema.
For example, the following SQL will return field metadata for all the table occurrences that appear in your relationship graph: SELECT * FROM FileMaker_Fields The kind of find we want to perform works best on Text type fields, as we will use the LIKE operator to find results. The equivalent field type in SQL is “varchar”. To return only a list of Text fields in your database, you include the clause “WHERE FieldType = ?” and give it a parameter of “varchar”. We also restrict our search by looking for only fields where the “FieldClass” is equal to “Normal”. That leaves us with a list of text fields that excludes any global fields or summary fields. This technique, as shown in the sample file, does not handle fields defined with repetitions. You could add support for that, but generally, you should avoid repeating fields in data. In our sample code, we abstract out the table name to make it easily portable. To adapt it to your own solution, update the table to target by changing the variable named $get.tablename. Step Two: Building the Search Request Now that we have a list of fields we want to search in, we can build the list of corresponding parameters we need for the ExecuteSQL function. Because we assemble the expression passed to the ExecuteSQL function dynamically, it is easier to build the SQL statement in variables and then use the Evaluate function to run it. We then define a corresponding search parameter for every field being searched on. As a result, we populate two variables with values: $this.fields and $this.params. Additionally, since queries run in SQL are case sensitive, we lowercase both sides of the comparison: the field values in SQL, using the LOWER function, and the search terms in FileMaker, using the Lower function. Using LOWER in SQL also prevents FileMaker from automatically indexing all of the fields being searched on. The only field we need returned is the ID field, which is the primary key in our table.
Once we have a list of primary keys, we can temporarily store those in a global text field and relate it to our target table to create a many-to-many relationship and show results in a standard portal. Finally, with the ExecuteSQL expression constructed, we can run it with the Evaluate function. Step Three: Multiple Search Parameters Since we build the SQL expression dynamically, we can also add support for entering multiple search parameters. In technical terms, we want to perform an “AND” query for every search parameter entered. A comma is defined as our search delimiter. For example, if we want to find all records that contain “Chicago” and also contain “IL” in any field, we can type “Chicago, IL.” The query returns all records where some field is LIKE %chicago% AND some field is LIKE %il%. Get the Search as You Type Using ExecuteSQL Sample File You can use the following sample file to examine the code and modify it for use in your own solution. The changes required to point the SQL to a new table in your solution are minimal: simply change the variable that is set for “$get.tablename” to your own table occurrence name and update the relationship and portal. https://github.com/SoliantMike/FM-ExecuteSQLAsYouType Special thanks to Mislav Kos for reviewing and suggesting several improvements. References ExecuteSQL documentation If you have any questions or need help with your FileMaker solution, please contact our team. The post Search as You Type Using ExecuteSQL appeared first on Soliant Consulting. View the original post
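To make the query-building logic from the post above concrete outside of FileMaker, here is a rough Python sketch using SQLite as a stand-in (the contacts table and its field names are hypothetical, and SQLite's SQL dialect differs in places from FileMaker's). It mirrors the described approach: lowercase both sides, OR the LIKE predicates across fields for each term, and AND the comma-separated terms together:

```python
import sqlite3

def build_search_query(table, text_fields, raw_input):
    """Return (sql, params): every comma-separated term must match (AND),
    and a term matches if ANY of the text fields contains it (OR)."""
    terms = [t.strip().lower() for t in raw_input.split(",") if t.strip()]
    and_clauses, params = [], []
    for term in terms:
        ors = " OR ".join(f"LOWER({f}) LIKE ?" for f in text_fields)
        and_clauses.append(f"({ors})")
        params.extend([f"%{term}%"] * len(text_fields))
    where = " AND ".join(and_clauses) or "1=1"  # no terms: match everything
    return f"SELECT id FROM {table} WHERE {where}", params

# Demo with an in-memory stand-in table.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE contacts (id INTEGER, city TEXT, state TEXT)")
con.executemany("INSERT INTO contacts VALUES (?, ?, ?)",
                [(1, "Chicago", "IL"), (2, "Springfield", "IL"), (3, "Chicago", "WI")])
sql, params = build_search_query("contacts", ["city", "state"], "Chicago, IL")
ids = [row[0] for row in con.execute(sql, params)]
print(ids)  # [1]
```

Typing "Chicago, IL" returns only the record matching both terms, which is the same narrowing behavior the FileMaker portal shows as you type.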
  12. One of the biggest risks in undertaking a big software development project is underestimating the time and effort involved from start to finish. Our team has provided estimates for hundreds of projects, and sometimes we’ve missed the mark, especially in the early days of our business. Over the years, we’ve put together a strong model for estimating how much time and effort a project is likely to require; our employees undergo annual training on the method. Using these methods ensures we present our clients with the most useful estimates possible and effectively set expectations for each project. Not Just Simple Math When you think of hours estimation, you probably imagine a small team of people putting together a list of tasks and then assigning hours to each. From there, it’s just simple addition, right? From the outside, I’m sure that’s what the estimation process seems like. Behind the scenes, however, things work much differently. Building Uncertainty Ranges No two projects are exactly the same. Even if you feel like you’re developing the same solution, there are too many variables at play to borrow from another experience to build your estimate. Comparing one project to another can be good for getting an order-of-magnitude or “ballpark”-level estimate. However, you wouldn’t want to make firm commitments based on just a rough comparison. Adding Non-Development Factors Not all of the effort in a project is spent by developers. Our project management team works hard to ensure communication remains open and transparent and keeps projects running smoothly. They need time to accomplish their tasks. Especially going into a new client relationship, no one knows how much time and effort that will take. Some of our clients appreciate a daily update email and have few questions. Others like detailed explanations on how things work behind the scenes. 
Projects also vary widely in how much support they need; for small projects with a single developer, a project manager may stay a little behind the scenes, making sure the developer has what she needs and sending periodic status updates. In larger projects with multiple developers, multiple client contacts, and a multi-step testing process, the project manager is heavily involved every day. He or she leads scrums and meetings, facilitates communication, and monitors overall project risks. Besides project management, an estimate should include time for meetings, both internal and external, for quality assurance activities, and for work necessary for deployment, to name some of the major non-coding categories. An estimate that fails to budget time for activities like testing, internal and external communication, and project management may only come up slightly short for a smaller project. However, it will leave a major gap for a large effort. Foundation Phase Estimates If a project requires up-front design and discovery work (and most do), you also need to account for business analysis time. During our Foundation phase activities (also known in the industry as “discovery and design”), our business analysts work with clients to uncover their biggest challenges, opportunities, and goals and then build a plan to address the highest-value features and capabilities during solution development. As we wrap up design and discovery, our developers build a more accurate estimate of hours based on their lessons learned and the emerging solution blueprint. (In fact, we often provide mid-Foundation in addition to post-Foundation estimates to help put our clients at ease and promote the transparency required for successful projects. Each one gets more and more accurate and therefore encourages better client communication.) Consider Task Size When estimating items, smaller is definitely better.
Most of us can’t reliably estimate pieces of effort that are bigger than a day or two in size. If your estimate has an item that says “manage invoicing, 20-50 hours,” you’re going to be well-served by breaking that into smaller units of functionality. This can include “Create Invoice,” “Print Invoice,” “View List of Invoices,” and so forth. If the project has several types of invoice, each with its own print format, then you can probably break “Print Invoice” into smaller items, one for each invoice type. If it’s too soon to break that item down, maybe it’s too soon to estimate it; or, if 20-50 hours “feels” right, just be aware that you’re estimating based partly on intuition. Make sure your estimate has a range that allows for the fact that there’s more definition yet to do. Number of Estimated Items The number of estimated items matters too. The fewer tasks you’re combining into an estimate, the less meaningful your estimate is. Nobody estimates items right on the nose. We’re either over or we’re under, for any given item. With a large number of items, we hope that the overs and the unders balance out. With too few items, your estimate risks being derailed by a poor estimate for a single item. We abide by the rule of ten: estimates containing fewer than ten items are very dubious. The farther over ten items you go, the more stable your math gets. For best results, try to have no fewer than twenty. Size of Estimate Ranges It is just as important for all of your estimate items to fit within a certain range of sizes. Huge disparities between your biggest and smallest estimate items suggest either that the small items have been broken down prematurely, or that the big items are not yet broken down enough, adding uncertainty to your estimate and the project as a whole. As a rule of thumb, we try to have a range of no greater than 6x between the smallest and largest items in an estimate.
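The idea that "overs and unders balance out" is the law of large numbers at work, and a quick simulation makes it tangible. This is plain Python with illustrative numbers only, not our actual tooling: each item's actual effort deviates randomly from its estimate, and the relative error of the project total shrinks as the item count grows:

```python
import random

random.seed(42)  # fixed seed so the simulation is repeatable

def relative_error_of_total(n_items, trials=2000):
    """Average |actual - estimated| / estimated for a project of n_items
    tasks, each estimated at 10 hours with +/-50% random noise."""
    errors = []
    for _ in range(trials):
        actual = sum(10 * random.uniform(0.5, 1.5) for _ in range(n_items))
        errors.append(abs(actual - 10 * n_items) / (10 * n_items))
    return sum(errors) / trials

few, many = relative_error_of_total(3), relative_error_of_total(30)
print(f"3 items:  ~{few:.1%} average error on the total")
print(f"30 items: ~{many:.1%} average error on the total")
# With more items, independent overs and unders cancel, so the
# total's relative error is markedly smaller.
```

The individual item estimates are just as noisy in both runs; only the count changes, which is why a list of twenty-plus items produces steadier totals than a list of three.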
Incomplete Task Lists

Estimation is all about finding something to count; your estimate is only valid to the extent your list of work items is complete. If you're missing 20% of your tasks, it doesn't matter how good your estimates for the rest are: you can't possibly know how many hours you will need for what's not there. This is why our team prefers to make multiple estimates and only feels confident making commitments based on an end-of-Foundation-phase estimate. Once we've gone through a thorough Foundation for a given body of work, we feel pretty confident that there are no major items completely missing from our list.

Using the Right Tools

To help our team estimate projects more effectively, we built a tool that puts these ideas into practice. It uses some simple statistical math to create a master estimate out of a collection of work item estimates. These estimates could be generated by a single developer; or, in a project with multiple people doing different kinds of work, they could come from separate estimators. The tool can merge the estimates of multiple estimators. Just remember that, as with all estimation, the best estimates are created by the people who will actually do the work. We believe successful businesses are built on efficient solutions, so how could we not build one for ourselves? We call it our Three-Point Estimator.

What is the Three-Point Estimator?

To provide our clients with accurate estimates, we built a custom solution for our projects. It starts with a set of line items representing individual work items; usually, these are user stories, though occasionally they represent necessary technical tasks. Estimators give each item three different estimates. One number represents the "expected" or "most likely" case: if everything goes reasonably well, it should take about this long. The other two numbers are a best case (if things go perfectly) and a worst case ("I can't imagine it taking me longer than X"). 
The algorithm then measures the span of this range. Wider ranges indicate more uncertainty in the estimate, while narrower ranges indicate more confidence. Finally, the algorithm combines the line-item estimates into an overall estimate. Just like the individual line items, the overall estimate is a range. Generally, we don't consider overly optimistic scenarios; we focus on project outcomes that, according to our tool, have a 60-90% chance of occurring. Sharing optimistic scenarios that only have, say, a 1 in 3 chance of coming true doesn't serve the project well. Our final project estimate, therefore, considers not only the sum of the starting task estimate ranges but also the uncertainty surrounding each item. This paints a clearer picture of what the project could require. Of course, we are always working toward more concrete numbers. The more work we do on a project, the closer we get to accurate numbers. This is why we often have a project-start estimate but provide a much tighter estimate following our Foundation phase: by then, we've learned enough about the work through extensive conversations with the clients and end users to minimize uncertainty in our estimated ranges for hours of work required.

Building Your Own Estimates

During your project estimation, I recommend keeping these principles in mind:

- Your biggest risk is not in mis-estimating an item on your list: it's in having no estimates at all for items that aren't on the list, but should be.
- Good estimates start with lists of at least 20 distinct items.
- Estimated items shouldn't vary too widely in size.
- Budget for non-development activities like meetings, interviews, project management, user acceptance testing, and training.

Then have a group meeting for a frank discussion around uncertainty in the project. Consider big gaps in estimates and what you need to know to reduce them. Then either track down that information or discuss and plan for best and worst case scenarios. 
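The post doesn't publish the exact math its tool uses to combine the three numbers; the classic three-point (PERT) approach is one common way to do it, shown here purely as an illustrative sketch:

```latex
% Per-item expected value and spread, from optimistic (O), most likely (M),
% and pessimistic (P) estimates:
E_i = \frac{O_i + 4M_i + P_i}{6}, \qquad \sigma_i = \frac{P_i - O_i}{6}
% Project-level totals (approximately normal when there are many items),
% reported as a range at a chosen confidence factor z:
E = \sum_i E_i, \qquad \sigma = \sqrt{\sum_i \sigma_i^2}, \qquad
\text{estimate} \approx E \pm z\,\sigma
```

Note how a wide optimistic-to-pessimistic span on any one item directly widens that item's sigma, which matches the article's point that wider ranges signal more uncertainty.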
Good luck with your next project estimate! If you have any questions, let us know in a comment below. The post Project Estimation Best Practices: A Look at Our Three-Point Estimator appeared first on Soliant Consulting. View the original post
  13. What Are Script Parameters?

Script parameters are one of many tools that all FileMaker developers should learn to utilize when developing custom FileMaker applications. Their numerous benefits include writing fewer scripts, increased code reusability, and improved code organization. By designing your scripts to pass parameters, you can use a single script with logic branching (i.e., If, Else If, Else) to perform multiple actions within your FileMaker solution. For example, consider creating a Sales Order record. You can create a Sales Order from a few different contexts in your solution, such as from a Customer record, from a previous Sales Order, or from a cancelled Order. Often there are similar requirements when creating a Sales Order from each of these contexts: maybe you need to know the corresponding customer for the order, or maybe you need to know the shipping address. By passing script parameters, you can use a single script to create a Sales Order from various contexts without having to write and maintain separate scripts. There are many ways to format and pass script parameters in FileMaker, and in this blog post we are going to start from the beginning and work our way up to the various ways we can pass multiple script parameters. This blog post borrows heavily from Makah Encarnacao's FileMaker DevCon 2018 presentation, so be sure to check out her video, slides, and example file for more details.

Single Script Parameter

FileMaker provides one primary method to pass script parameters into a script: the "Specify Script" dialog available throughout FileMaker. Adding a script parameter is done by simply typing into the "Optional script parameter" field at the bottom of the Specify Script dialog. Anything you type into this field will default to quoted text, unless FileMaker detects that you have entered a specific field or function. One can also click the "Edit..." 
button to the right of the "Optional script parameter" field to open a calculation dialog and provide a more complex script parameter. In most of our multiple-parameter examples, we will be using this calculation dialog to build the parameters we pass.

Figure 1 - Optional Script Parameter

Once we pass a single parameter through the "Optional script parameter" section, we can retrieve that value in our script using the Get(ScriptParameter) function. We will use this function throughout this blog post.

Multiple Parameters with Return Delimited Lists

Sometimes a single script parameter is all you need to send to a script, but as we develop more complex solutions, we often find the need to pass multiple parameters into our scripts to yield the desired result. The simplest way to send multiple script parameters is by using the pilcrow/return character (¶) or the List() function. For example, in our Specify Calculation dialog we may enter the following to pass information about a specific manufacturer in our solution: MNF__Manufacturer::Name & ¶ & MNF__Manufacturer::Headquarters & ¶ & MNF__Manufacturer::NetWorth & ¶ & MNF__Manufacturer::DateFounded This will allow us to pass four lines of data from a Manufacturer record to our script. Inside our script we can separate each of these four lines into their own local variables by using the GetValue() function and specifying each line in its own "Set Variable" script step, as shown in Figure 2.

Figure 2 - Pass parameters with a return delimited list

This method of passing multiple script parameters does have potential drawbacks. For example, if any field you pass as a parameter has a return character in it (as when passing an entire paragraph), it can throw off your "Set Variable" script steps, because line returns inside a field are preserved when the parameter is passed. 
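Restating Figure 2 in text form (the variable and field order follows the example above), the four "Set Variable" script steps might look like this sketch:

```
Set Variable [ $name ;         Value: GetValue ( Get ( ScriptParameter ) ; 1 ) ]
Set Variable [ $headquarters ; Value: GetValue ( Get ( ScriptParameter ) ; 2 ) ]
Set Variable [ $netWorth ;     Value: GetValue ( Get ( ScriptParameter ) ; 3 ) ]
Set Variable [ $dateFounded ;  Value: GetValue ( Get ( ScriptParameter ) ; 4 ) ]
```

Each GetValue() call pulls one line out of the return-delimited parameter, which is why a stray return character inside a field value shifts every later variable by one line.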
There are ways around this drawback: capture those return characters and convert them to another, less common character. It's also worth noting that the order of script parameters must match the variables that you set in the ensuing script steps. A mismatch can lead to the wrong data being set into the wrong field, in turn leading to difficult-to-troubleshoot bugs. What other options do we have that may help prevent these shortcomings?

Multiple Parameters with Pipe Delimited Lists

Instead of the return character, we can use other, less common characters, like the pipe character (|), to delimit our script parameters. In this example, we replace the ¶ character with the | character: MNF__Manufacturer::Name & "|" & MNF__Manufacturer::Headquarters & "|" & MNF__Manufacturer::NetWorth & "|" & MNF__Manufacturer::DateFounded With this method, we streamline the passing of parameters, but we need a few more functions on the variable declaration side to properly parse out the values contained in our script parameter. It is fairly straightforward to get the first and last values using the Left() and Right() functions, which let us use the position of a specific pipe character to determine where character parsing should start and end. Here is an example of returning the first value in the script parameters: Left ( $parameters ; Position ( $parameters ; "|" ; 1 ; 1 ) - 1 ) /* Subtract one because we don't want the pipe itself to be part of the name */ As we can see, this technique is more advanced and requires an understanding of the Left(), Right(), Middle(), and Position() functions to retrieve multiple parameters. Every developer should learn to utilize these powerful FileMaker functions in their custom applications. We have other methods to pass multiple script parameters, many of which are more elegant than using the pipe delimiter. 
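For completeness, before moving on, here is a hedged sketch of parsing all four pipe-delimited values in a single Let() calculation (field order follows the manufacturer example above; the Middle() start and length arguments are computed from the pipe positions):

```
Let ( [
  p = Get ( ScriptParameter ) ;
  pos1 = Position ( p ; "|" ; 1 ; 1 ) ;
  pos2 = Position ( p ; "|" ; 1 ; 2 ) ;
  pos3 = Position ( p ; "|" ; 1 ; 3 )
] ;
  List (
    Left ( p ; pos1 - 1 ) ;                      // Name
    Middle ( p ; pos1 + 1 ; pos2 - pos1 - 1 ) ;  // Headquarters
    Middle ( p ; pos2 + 1 ; pos3 - pos2 - 1 ) ;  // NetWorth
    Right ( p ; Length ( p ) - pos3 )            // DateFounded
  )
)
```

In a real script, each of those four expressions would feed its own "Set Variable" step rather than being collected with List(); the sketch just shows the parsing arithmetic in one place.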
Multiple Parameters with Let Function

The Let() function is a native FileMaker function that is often a mystery box for new developers. Using the Let() function, you can not only make your complex calculations more readable and modifiable, but also declare local and global variables inside it. Therefore, we can use the Let() function to pass not just the parameter values but also the variable names! This is a super powerful function indeed! See the below example of passing multiple script parameters using the Let() function: "Let( [ $name = MNF__Manufacturer::Name; $headquarters = MNF__Manufacturer::Headquarters ; $netWorth = MNF__Manufacturer::NetWorth; $dateFounded = MNF__Manufacturer::DateFounded ]; $name )" In the above code, we pass our Let() function inside quotes to preserve the code until we want to evaluate the parameters inside it. The first part of the Let() function is contained within square brackets ("[]") and is where we declare the local variables for each value. Each line inside this declaration is separated by a semicolon; think of the semicolon as the end of a sentence in a declaration. Because the Let() function is passed inside quotes, we can wait to declare our variables until we are inside the called script. We have FileMaker evaluate this Let() function by using the Evaluate() function, which simply takes the text passed to it and evaluates it. We can pass simple mathematical equations or other functions, as in the example in Figure 3.

Figure 3 - Pass parameters with the Let() function

With one script step, we are able to create multiple local variables for our script! See how the Let() function is super powerful? Use it! There is a drawback to this method of script parameter passing: the parameter you pass is pretty verbose, and the potential for typos is higher. Typing the wrong variable name in your Let() function can have unexpected consequences. 
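On the receiving side, the quoted Let() above can be brought to life with a single step; a minimal sketch (the variable name $result is an arbitrary placeholder, since only the side effect of declaring the $variables matters):

```
// In the called script: evaluating the parameter text runs the Let(),
// which declares $name, $headquarters, $netWorth, and $dateFounded
// as local variables of this script.
Set Variable [ $result ; Value: Evaluate ( Get ( ScriptParameter ) ) ]
```
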
Make sure to test your script with various parameters when using this method.

Multiple Parameters with Custom Function

An alternative to the Let() function is using a pair of custom functions to declare and then set your script parameters to local variables. This allows predefining your script parameters as variables, but with a simpler syntax and fewer script steps, by using the "#Assign()" custom function and its related custom function "#". Visit FileMakerStandards.org to learn about these custom functions in detail. Here is an example of how to pass script parameters using the "#" custom function: # ( "name" ; MNF__Manufacturer::Name ) & # ( "headquarters" ; MNF__Manufacturer::Headquarters ) & # ( "netWorth" ; MNF__Manufacturer::NetWorth ) & # ( "dateFounded" ; MNF__Manufacturer::DateFounded ) This custom function uses name-value pairing: the value on the left side becomes your variable name in the ensuing script, and the value on the right becomes your variable value. Once we pass our parameters in this format, we simply call the "#Assign()" custom function with the Get(ScriptParameter) function, as shown in Figure 4.

Figure 4 - Pass parameters with a name-value pairs function

The result is multiple variables based on the name-value pairing we defined in our script parameter. This method does require importing custom functions into your solution. It also requires careful entry of name values to ensure variables are used correctly in your script. Overall, however, this method provides an elegant and speedy way of passing script parameters and setting your script's local variables.

Multiple Parameters with JSON

FileMaker introduced native JSON functions in FileMaker Pro 16, but custom JSON functions have been around for some time. JSON stands for JavaScript Object Notation. You can store data in a similar name-value pair format, as described in our custom function option above. 
With JSON, we can use native FileMaker functions to pass multiple script parameters using the JSONSetElement() function. The added power of JSON is the ability to nest related data inside a JSON object. Think of this as the ability to send not only data from the current record you are viewing in FileMaker, but also related data, such as order line items or customer contact records. This allows larger sets of data to be transported in the well-known JSON data format. See this example of multiple data sets in a single script parameter declaration:

JSONSetElement ( "" ;
  ["name" ; MNF__Manufacturer::Name ; JSONString] ;
  ["headquarters" ; MNF__Manufacturer::Headquarters ; JSONString] ;
  ["netWorth" ; MNF__Manufacturer::NetWorth ; JSONNumber] ;
  ["dateFounded" ; MNF__Manufacturer::DateFounded ; JSONString] ;
  ["relatedModels" ; JSONSetElement ( "" ;
    ["model[0].name" ; GetNthRecord ( MNF_MOD__Model::Name ; 1 ) ; JSONString] ;
    ["model[1].name" ; GetNthRecord ( MNF_MOD__Model::Name ; 2 ) ; JSONString] ;
    ["model[2].name" ; GetNthRecord ( MNF_MOD__Model::Name ; 3 ) ; JSONString] ;
    ["model[3].name" ; GetNthRecord ( MNF_MOD__Model::Name ; 4 ) ; JSONString] ;
    ["model[4].name" ; GetNthRecord ( MNF_MOD__Model::Name ; 5 ) ; JSONString] ;
    ["model[0].body" ; GetNthRecord ( MNF_MOD__Model::BodyStyle ; 1 ) ; JSONString] ;
    ["model[1].body" ; GetNthRecord ( MNF_MOD__Model::BodyStyle ; 2 ) ; JSONString] ;
    ["model[2].body" ; GetNthRecord ( MNF_MOD__Model::BodyStyle ; 3 ) ; JSONString] ;
    ["model[3].body" ; GetNthRecord ( MNF_MOD__Model::BodyStyle ; 4 ) ; JSONString] ;
    ["model[4].body" ; GetNthRecord ( MNF_MOD__Model::BodyStyle ; 5 ) ; JSONString] ;
    ["model[0].year" ; GetNthRecord ( MNF_MOD__Model::Year ; 1 ) ; JSONString] ;
    ["model[1].year" ; GetNthRecord ( MNF_MOD__Model::Year ; 2 ) ; JSONString] ;
    ["model[2].year" ; GetNthRecord ( MNF_MOD__Model::Year ; 3 ) ; JSONString] ;
    ["model[3].year" ; GetNthRecord ( MNF_MOD__Model::Year ; 4 ) ; JSONString] ;
    ["model[4].year" ; GetNthRecord ( MNF_MOD__Model::Year ; 5 ) ; JSONString] 
  ) ; "" ]
)

To set each script parameter as a variable, we can also use a custom function that works similarly to the #Assign() function but for the JSON format. Figure 5 shows an example using the JSONCreateVarsFromKeys() custom function.

Figure 5 - Pass parameters using JSON

If we don't want to use a custom function, we can use the JSONGetElement() function to grab each name-value pair individually. The custom function route takes very few lines of code, but the native function provides individual script steps that may aid in debugging. Another thing to account for when using the native JSON functions is that not all data types match up perfectly between JSON and FileMaker. For example, the date format that FileMaker uses is not directly supported in JSON, so we have to pass date values as strings. Here is a useful chart showing how each data type corresponds between the two formats:

FileMaker Pro    JSON Match
Text             String
Number           Number*
Date             String or Number
Time             String or Number
Timestamp        String or Number
Container        String
                 Object
                 Array
                 Null

Passing Containers

Up to this point, we have described how to send parameters in some sort of text format, whether the data type is date, time, string, or number. But what about binary data stored inside a native FileMaker container field? There are a few ways we can send container fields as a script parameter, and I will describe the drawbacks of each. The first method to transport container data via a script parameter is to use the native FileMaker Base64Encode() and Base64Decode() functions. The Base64Encode() function takes container data and converts it into a large block of text to make it easier to transport via script parameters or to other data destinations. To reverse the Base64 encoding, use the Base64Decode() function to store the data back in its native FileMaker format. 
See this example of passing a container as Base64-encoded text: Base64Encode ( MNF__Manufacturer::Logo ) To reverse the process, we use the Base64Decode() function, as shown in Figure 6.

Figure 6 - Use the Base64Decode() function

Unfortunately, some file metadata is lost in translation when Base64 encoding your container data. For example, a JPEG image going through this process loses information related to creation, modification, latitude, and longitude, among other metadata. In some development situations, this is not an acceptable result. The alternative is to temporarily move the container field into a global container field and, from there, set your destination field to the value stored in the global container. See the example shown in Figure 7.

Figure 7 - Set the destination container to the global

Conclusion

As we can see, what started as the simple passing of one parameter to a script can become quite complex and varied. This should not dissuade you from using script parameters. Their benefits are numerous, and they can take your FileMaker development to the next level. By using script parameters, you can increase your development productivity and streamline various functions within your database with less code. I hope that laying out the above options provides an informative map of the possibilities, helps you find what works best for your development practice, and leads you to learn more about native and custom functions in FileMaker Pro. Have fun passing your parameters!

Resource

Makah Encarnacao's DevCon Presentation - FileMaker Community

Questions? Leave a comment below, or if you need help with your FileMaker solution, please contact our team. The post Passing Script Parameters: A Complete Summary from Simple to Complex appeared first on Soliant Consulting. View the original post
  14. FileMaker Portal Columns

I was recently asked to sort portal columns for a client, and I figured there had to be a newer and cooler technique out there to accomplish a portal sort than the one I used last time. I reached out to the other FileMaker developers at Soliant, and I got a great sample file from Ross Johnson. He shared a really cool technique with me, crediting mr_vodka (sounds like a fun guy!). Read mr_vodka's original post here. For my client, I was also asked to filter the portal and batch update columns. The end product came out pretty cool, so I decided to create a sample file with all these techniques put together in one file to share with the FileMaker community. The data in my sample file is from Mislav Kos' post: Test Data Generator

Figure 1 - Portal with filters and batch update columns

Get the Demo File

Download the demo file to follow along with the instructions outlined below.

Sort

Here are the steps to complete the portal sort. You'll need to use the sample file to copy and paste some components. Copy the field "zz_portal_sort_c" into your solution; you'll need to copy it into the table on which your portal is based. Open the field definition for zz_portal_sort_c. The only update you'll need to make to this calculation is to set the Let variable "id" to the primary key of your table. For example, if your primary key is something like "_kp__ContactID", you'll need to have "id = ¶ & _kp__ContactID & ¶ ;" (see Figure 2).

Figure 2 - Set the Let variable ID

NOTE: Be sure this calculation returns a number (see Figure 3); that's very important!

Figure 3 - Select "Number" for the calculation result

Next, copy the script "sortRecords ( field { ; object } )" into your app. 
You'll need to update line 51 of the script and change "ID" in the ExecuteSQL statement to use the primary key field for the table on which your portal is based (see Figure 4).

Figure 4 - Update line 51 in the script

You should also update line 6 (the header information) to reflect that you added this script to the file, and the date you did so. Back in your layout, update your portal to sort by the new field you just added to the table.

Figure 5 - Update your portal sort

Name the portal object "portal". If you prefer a different object name, it can be anything you'd like, but you'll need to update almost all the scripts in this demo to reflect the new portal object name.

Figure 6 - Name the portal object

You can now assign the script to your column labels with a parameter of: List ( GetFieldName ( <table>::<field> ) ; "portal" ) I also added some conditional formatting to turn the label bold when its column is sorted. Additionally, as a visual cue that the sort is bidirectional, I added an up and down arrow for each column and assigned a conditional hide to it. You can copy and paste the buttons into your solution, and then update the hide calculation to use your fields. And that's it! Once it's all set up, sorting for each column should work. One thing I want to note: this method assumes that the relationship is one-to-many. I tried it using a Cartesian join, and it broke the sort. I haven't tried anything more complicated than one-to-many.

Filter Columns

Filtering each column allows the user to do an "AND" search in the portal, which means that your user can filter on multiple criteria. If you used a single search bar to filter, it would be an "OR" search. To be honest, I haven't researched whether there's a better technique out there. This method made logical sense to me when I wrote it, and lucky for me, it worked. If you know of a better approach, I'd love to hear it; please leave a comment below. 
Here are the steps to complete this filter technique: Create global fields for every column you'd like to filter.

Figure 7 - Create global fields

Place those fields on the layout.

Figure 8 - Place the fields on the layout

Add the script "Trigg_CommitRefresh" to your script workspace, and then assign it as a trigger to the filter fields with a trigger type of OnObjectExit. This script trigger commits the record and refreshes the portal every time a user exits a filter field. Gender is a little different; it uses OnObjectModify. You'll learn why gender is different a little further down in this post. Now we update the filter calculation for the portal. You can copy the code from the filter calculation into your portal calculation and then update it in your file to match your fields. The filter calculation is a single Let statement that has four parts:

1. Define "AllClear", a flag that checks whether all the globals are empty.
2. Define which filters have any text in them; in other words, which filters are being enacted by the user.
3. Define each filter result at the record level: if the user entered filter text, does the current record pass that filter (returning 1), or is it filtered out (returning null)?
4. Finally, compare the results. If AllClear is true, always show the record (return 1). Otherwise, count up how many filters the user is applying and how many columns pass the filter for the given record. If these two sums match, the record passes the filter check; if not, the current record has been filtered out. 
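The four-part structure above might be sketched as a single Let() for two columns (the table, field, and global names here are hypothetical placeholders, not the demo file's actual names):

```
Let ( [
  // Part 1: flag that no filters are active
  AllClear = IsEmpty ( Contact::FilterName_g ) and IsEmpty ( Contact::FilterCity_g ) ;

  // Part 2: which filters the user is enacting
  FilterName = not IsEmpty ( Contact::FilterName_g ) ;
  FilterCity = not IsEmpty ( Contact::FilterCity_g ) ;

  // Part 3: does the current record pass each active filter?
  FilterName_R = If ( FilterName ; PatternCount ( Contact::Name ; Contact::FilterName_g ) > 0 ) ;
  FilterCity_R = If ( FilterCity ; PatternCount ( Contact::City ; Contact::FilterCity_g ) > 0 )
] ;
  // Part 4: show the record if no filters are set, or if every active
  // filter is matched (the counts of active filters and passed filters agree)
  If ( AllClear ; 1 ;
    FilterName + FilterCity = FilterName_R + FilterCity_R )
)
```
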
You'll need to update the following for this calculation to work in your file: The globals you'd like to filter within the "AllClear" definition. The filter check section: Filter<FieldName> = not IsEmpty ( <Table>::<FilterGlobal> ) The filter result section: Filter<FieldName>_R = If ( Filter<FieldName> ; PatternCount ( <Table>::<FieldName> ; <Table>::<FilterGlobal> ) > 0 ) NOTE: You'll notice the gender result is a little different; see the "Gender Difference" note below, which explains why. The results comparison will also need to be updated: If ( AllClear ; 1 ; Filter<FieldName1> + Filter<FieldName2> ... = Filter<FieldName1>_R + Filter<FieldName2>_R ... ) Gender Difference: For most of these filters, I'm using the PatternCount() function because I want to include partial matches. However, with gender, if I searched for "male," I would always get male and female results, since the string "male" is inside the string "female." Because there are only two options in this case, I turned the filter into a radio button; now I don't have to worry about partial word entries, and I can make a complete word comparison in the calculation. That's why gender does not use PatternCount() and instead uses "=" to check whether the filter and the value are identical.

Batch Update

The batch update feature goes hand in hand with filtering: the user filters the data and then performs a batch update. There are two ways to accomplish this feature: go to related records in a new window and perform a Replace Field Contents, or loop through the portal itself. I decided to loop through the portal because I liked that you don't have to leave the current window. Both methods accomplish the same goal, however, and if you have a lot of related records to update, Replace Field Contents might be a little faster. For a typical use case, looping through the portal works well. To complete the batch column update, you'll need to copy the script "BatchUpdate (field)" into your file. 
If you haven't already, you'll need to name your portal object "portal" for this script to work. You should also update the history in the script header to communicate that you added this script to your file and when. I recommend duplicating line 3 and then adding your name/email, the current date, and a note about how you copied this script into the file. The rest of the script is ready for use. If you'd like, you can customize the dialog in step 9. Now you'll need to add the batch buttons to the layout. Each button will call the BatchUpdate script you just copied over and will pass, in quotes, the parameter of the field you'd like to update.

Figure 9 - Button setup

That's the summary of how these three features are set up in the sample file. I hope you find it useful. Questions? Leave a comment below, or if you need help with your FileMaker solution, please contact our team. The post Portal Column Techniques: Sort, Filter, and Batch Update appeared first on Soliant Consulting. View the original post
  15. Take Your Portals to the Next Level

FileMaker portals are invaluable for showing related data, and there are several techniques for enhancing their functionality. Our demo contains three methods for adding the ability to sort, filter, and batch update portals. Follow along with our step-by-step guide and get started with expanding the functionality of the portals in your FileMaker solution. Complete the form to receive the demo file. The post FileMaker Portal Columns: Sort, Filter, and Batch Update Demo appeared first on Soliant Consulting. View the original post