This article discusses common testing methods used to match a client with the best possible pressure-sensitive products for their application.

In the world of pressure-sensitive adhesives (PSAs), there is no shortage of physical property data. Peel strength, shear strength, loop tack and temperature information are all readily available. The question is: How useful is all of this data?

The short answer: Not very.

As a PSA provider, Adchem spends hours coaching its sales staff to understand the application and the conditions to which a PSA is subjected before we can select a product from our portfolio.

We are commonly asked for products equivalent to competitors’, or those that offer a specific property. But before we can make an assessment, we need to know what application the product will be used in. Only then can we assist the customer in finding the technically and economically optimal solution.

The problem is, people love numbers. Customers sometimes would rather compare numbers in the comfort and privacy of their office than interact with the vendor and divulge “secret” information. By divulging information to us, they worry that they’ll lose control over their project when, in fact, the opposite is true. By providing comprehensive, accurate information, we can best help them find the solution they need.

Test Procedures

Before we examine different properties and factors that can affect the numbers, the source of the information and the test procedures themselves should be discussed. The most common tests are those sanctioned by the Pressure Sensitive Tape Council (PSTC) and those developed by ASTM International (formerly the American Society for Testing and Materials). In addition, some vendors report results from their own in-house tests. Large customers - particularly in the automotive, fenestration, and electronics markets - require myriad test procedures to qualify. When comparing numbers, it is important to know who conducted the tests and the methodology used. Comparing numbers from two different test methods is like comparing apples to oranges.

How the numbers are generated is equally important. Did the vendor perform the testing, or was an independent outside testing facility used? The American Association for Laboratory Accreditation (A2LA) has established standards for laboratories that include equipment calibration, technician training, laboratory environment and record keeping. For example, A2LA has certified Adchem to ISO/IEC 17025, the standard for independent testing laboratories. Data published by Adchem’s A2LA-accredited laboratory is accurate and free from subjective commercial influences. Adchem has invested heavily in this laboratory, whose scope of accreditation includes the common tests run on pressure-sensitive tapes.

Types of Tests

We will look at several common properties and the factors that influence their numbers. Adhesive chemistry is one such factor; others include dwell time (the time the PSA has been in contact with the substrate), the nature of the substrate, and the type of backing used for the test specimen.
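
To illustrate just how much context a single test number requires, the sketch below (in Python, with field names of our own choosing, purely for illustration) records the conditions that should travel with any PSA result before it is compared with another:

    from dataclasses import dataclass

    # Hypothetical record of the conditions that must accompany any PSA
    # test number; the field names are illustrative, not from any standard.
    @dataclass
    class PSATestConditions:
        method: str            # e.g., a PSTC or ASTM procedure, or in-house
        chemistry: str         # e.g., "acrylic" or "rubber"
        backing: str           # e.g., "PET film", "dead-soft aluminum foil"
        substrate: str         # e.g., "stainless steel"
        dwell_time_min: float  # time the adhesive sat on the substrate
        test_speed_ipm: float  # crosshead speed of the apparatus, in./min

    a = PSATestConditions("PSTC-101", "acrylic", "PET film",
                          "stainless steel", 20.0, 12.0)
    b = PSATestConditions("PSTC-101", "acrylic", "dead-soft aluminum foil",
                          "stainless steel", 20.0, 12.0)

    # Two numbers are only comparable when every condition matches.
    print("Comparable:", a == b)  # False - the backings differ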

Loop tack is a common measure of “quick stick,” or how fast the PSA will adhere to the substrate. A tensile-type machine lowers a loop of the test tape onto a substrate - typically stainless steel - and makes contact for a specified period of time, typically one second or less. The loop is then pulled away, and the force required to remove it is measured, usually in pounds per inch of width. When looking at loop tack numbers, it is important to know the machine speed as well as the backing being used, as these factors will influence the final number.
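
As a minimal worked example (the numbers below are illustrative, not measured data), a loop tack reading is simply the peak removal force normalized to the width of the specimen:

    # Illustrative only: normalizing a loop tack reading to force per unit
    # width, the usual reporting basis for this test.
    peak_force_lb = 2.4    # peak force recorded as the loop is pulled away
    sample_width_in = 1.0  # width of the tape specimen

    loop_tack = peak_force_lb / sample_width_in  # pounds per inch of width
    print(f"Loop tack: {loop_tack:.2f} lb/in "
          f"(comparable only at matching machine speed and backing)")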

“Thumb appeal,” or the time-honored practice of using finger pressure to predict the quick-stick properties of a PSA, must also be addressed. This test does not produce any comparable numbers, and its reliability is questionable.

Peel numbers, usually expressed in pounds per linear inch, are affected by the speed of the test apparatus; the dwell time, or the elapsed time the adhesive has been in contact with the test surface following application; the backing material (PET film is common, but higher numbers can be generated by using dead-soft aluminum foil as the backing); and the adhesive chemistry.
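
Since a peel value is typically reported as an average force over the peeled length, normalized to specimen width, a minimal sketch of that calculation (with made-up readings standing in for a real force trace) looks like this:

    # Illustrative sketch: a peel number is the average of the force trace
    # over the peeled length, divided by the specimen width. These readings
    # are invented for illustration; real traces come from the tester.
    force_trace_lb = [1.9, 2.1, 2.0, 2.2, 2.1, 2.0]
    sample_width_in = 1.0

    avg_peel = sum(force_trace_lb) / len(force_trace_lb) / sample_width_in
    print(f"Peel strength: {avg_peel:.2f} lb/in (pounds per linear inch)")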

Figure 1 shows the impact of both peel angle and backing material for a typical acrylic PSA. Obviously, if one does not know the test conditions, comparing the numbers is useless. Rubber-based systems will generally exhibit higher peel numbers than acrylic-based systems. PSA manufacturers typically report immediate peel numbers, since this test procedure is routinely performed as part of the company’s QA process. Unfortunately, these will be the lowest numbers one would see on a system, and they seldom relate to actual use.

Figure 2 shows the impact of dwell time on peel strength, with different backing materials and substrate types. Here it is obvious that some conditions show little impact with dwell time, while for others the impact is quite dramatic. There are many ways to generate numbers, so it is important that the conditions under which they were produced be well understood.

Static shear testing is done by hanging weights from a sample of PSA and measuring the time to cohesive failure (the adhesive film splits). Adhesive failure (when the adhesive film cleanly delaminates) would not be considered a measurement of the shear strength of the adhesive system. Shear testing may be done at room temperature or at an elevated but constant temperature. Variables to consider before comparing shear testing results include the weight and the area of the test sample. Lighter weights are sometimes used at higher temperatures. Tests are often discontinued if the sample is still “hanging” after seven days, and the result is reported as “7+ days.” One way to discern differences between such samples is to continue testing until the first candidate fails.
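
The “7+ days” convention means these results are effectively censored, which is worth making explicit when tabulating candidates. Below is a small sketch of how such results might be recorded; the cutoff and the sample values are our own illustrative assumptions:

    # Illustrative: recording static shear results when the test is cut off
    # at seven days; cutoff and values are assumptions, not real data.
    TEST_CUTOFF_HOURS = 7 * 24

    # (candidate, hours hanging, still intact at cutoff?)
    results = [
        ("Candidate A", 52.0, False),              # cohesive failure at 52 h
        ("Candidate B", TEST_CUTOFF_HOURS, True),  # reported as "7+ days"
    ]

    for name, hours, censored in results:
        print(name + ":", "7+ days" if censored else f"{hours:.0f} h")
    # Two "7+ days" entries cannot be ranked against each other; testing on
    # to actual failure is what discerns the difference.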

Dynamic shear testing defines the strength of the adhesive as measured by the force required to generate a shear failure. In this case, the result is expressed as force per unit area (lbs./in.²) rather than time. Since the rate of application of the force is the determining factor, it is critical that the rate be known or reported as part of the test results.
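
A minimal worked example of the arithmetic (the overlap dimensions, force, and rate below are all illustrative):

    # Illustrative: a dynamic shear result is failure force divided by the
    # bonded area, so the overlap dimensions and pull rate must be reported.
    failure_force_lb = 45.0
    overlap_area_in2 = 0.5 * 1.0  # 0.5 in. x 1.0 in. bonded overlap
    pull_rate_ipm = 12.0          # rate of force application

    shear_strength_psi = failure_force_lb / overlap_area_in2
    print(f"Dynamic shear: {shear_strength_psi:.1f} lb/in.² "
          f"at {pull_rate_ipm} in./min")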

Service temperature is a frequently misunderstood test. The service temperature is primarily a function of the adhesive itself, and is typically reported as both a constant value and an intermittent value. Intermittent results always need further definition. Is it five minutes at the elevated temperature and five minutes at the reduced temperature, or is it five days at each temperature? Vastly different results can be expected. Other factors that influence service temperature include the substrate being bonded; ambient conditions, such as how much humidity is present; how the PSA was applied; and the amount of dwell time preceding the test.
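
The arithmetic below makes the point concrete, using the two hypothetical cycle definitions from the question above:

    # Illustrative: the same "intermittent" rating can describe vastly
    # different exposures depending on how the cycle is defined.
    cycles = 10
    short_cycle_min = 5            # 5 min hot / 5 min cold per cycle
    long_cycle_min = 5 * 24 * 60   # 5 days hot / 5 days cold per cycle

    print(f"Short cycles: {cycles * short_cycle_min} min total at temperature")
    print(f"Long cycles: {cycles * long_cycle_min // 1440} days total at temperature")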

Another number that customers commonly use for comparison is the SAFT, or shear adhesion failure temperature. The test sample preparation is identical to that for static shear testing, but the test temperature is raised in increments until the sample fails. The temperature reported bears little relationship to the conditions of an actual use situation. All of the variables affecting service temperature come into play for the SAFT test as well. When comparing these numbers, be sure the test conditions were identical.
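
The logic of the procedure can be sketched as a simple temperature ramp; the starting point, increment, and failure check below are illustrative assumptions on our part, not a sanctioned method:

    # Simplified sketch of the SAFT ramp: raise the temperature in steps
    # until the hanging sample drops. Start, step, and ceiling are our own
    # illustrative choices, not from PSTC or ASTM procedures.
    def run_saft(sample_fails_at_c, start_c=25, step_c=10, max_c=250):
        temp = start_c
        while temp <= max_c:
            if temp >= sample_fails_at_c:  # stand-in for observing the drop
                return temp                # shear adhesion failure temperature
            temp += step_c
        return None                        # survived the full ramp

    print("SAFT:", run_saft(sample_fails_at_c=143), "°C")  # -> 145 °C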

Comparing Test Results

Comparing numbers on vendors’ data sheets is nearly impossible, given the many ways testing and test samples can differ. In addition, such comparisons rarely have any bearing on how a product works in a given application. The best evaluation process involves side-by-side testing of all PSA candidates. Our history of side-by-side testing sometimes reveals interesting differences between published data and our own test data. This is not to say that manufacturers are publishing bad data; they are selecting test methods and conditions that put their products in the most favorable light, and they have every right to do that. It just makes it all the more difficult to compare the numbers.

Comparing products in actual use conditions and/or environments involves time and effort, but is infinitely better than comparing by numbers alone. As mentioned, developing tests that more accurately predict the performance of the PSA in its final application has more value for the customer.

For example, we created the 90
