This article provides a comprehensive overview of cobalt-base alloys, focusing on their applications as wear-resistant, corrosion-resistant, and heat-resistant materials. Cobalt demonstrates exceptional utility in applications requiring magnetic properties, corrosion resistance, wear resistance, and elevated-temperature strength. The primary emphasis centers on cobalt-base alloys for wear resistance, representing the largest application area. While cobalt serves extensively as an alloying element in nickel-base alloys for heat-resistant applications, this discussion explores the fundamental properties of elemental cobalt and the diverse applications of cobalt-base alloy systems in industrial environments.
With an atomic number of 27, cobalt lies between iron and nickel on the periodic table and has a density of 8.8 g/cm³, close to that of nickel. Its coefficient of thermal expansion also falls between those of iron and nickel, making it suitable for applications requiring dimensional stability across temperature ranges.
Cobalt demonstrates interesting crystallographic behavior depending on temperature conditions. At temperatures below 417°C, cobalt exhibits a hexagonal close-packed structure, while between 417°C and its melting point of 1493°C, it adopts a face-centered cubic structure. This structural transformation contributes to the unique properties of cobalt-base alloys under varying thermal conditions.
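The transformation just described amounts to a simple temperature lookup. A minimal sketch in Python (the function name is illustrative; the boundaries are the 417°C transformation and 1493°C melting temperatures given above):

```python
def cobalt_structure(temp_c: float) -> str:
    """Return the stable crystal structure of pure cobalt at a temperature in degrees C."""
    if temp_c < 417:
        return "hcp"   # hexagonal close-packed below the transformation temperature
    if temp_c < 1493:
        return "fcc"   # face-centered cubic from 417 C up to the melting point
    return "liquid"    # above the melting point

print(cobalt_structure(25), cobalt_structure(1000))  # hcp fcc
```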
The mechanical properties of cobalt include an elastic modulus of approximately 210 GPa (30 x 10⁶ psi) in tension and about 183 GPa (26.5 x 10⁶ psi) in compression, providing excellent structural integrity for demanding applications.
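As a quick check that the paired GPa and psi values above are consistent, the unit conversion can be done in a few lines (a sketch; the factor 1 psi = 6894.757 Pa is the standard definition, not stated in the text):

```python
# Convert elastic moduli from GPa to millions of psi (Mpsi).
# 1 psi = 6894.757 Pa, so 1 GPa = 1e9 / 6894.757 psi.
PSI_PER_GPA = 1e9 / 6894.757

def gpa_to_mpsi(gpa: float) -> float:
    """Convert a modulus in GPa to millions of psi."""
    return gpa * PSI_PER_GPA / 1e6

print(f"Tension:     {gpa_to_mpsi(210):.1f} Mpsi")  # ~30.5, i.e. ~30 x 10^6 psi
print(f"Compression: {gpa_to_mpsi(183):.1f} Mpsi")  # ~26.5 x 10^6 psi
```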
Beyond forming the foundation of cobalt-base alloys, cobalt serves as a critical ingredient in numerous industrial materials. Paint pigments represent the single largest use of cobalt, utilizing its distinctive coloring properties and chemical stability.
In nickel-base superalloys, cobalt typically comprises 10 to 15 weight percent, providing solid solution strengthening while decreasing the solubility of aluminum and titanium. This reduction in solubility increases the volume fraction of gamma prime (γ') precipitate, enhancing the overall strength characteristics of the alloy system.
The role of cobalt in cemented carbides involves providing a ductile bonding matrix for tungsten-carbide particles. Commercially significant cemented carbides contain cobalt ranging from 3 to 25 weight percent, with cutting tool materials commonly utilizing 3 to 12 weight percent cobalt for optimal performance.
Cobalt's natural ferromagnetic properties contribute to resistance against demagnetization in permanent magnet materials, including aluminum-nickel-cobalt alloys containing 5 to 35 weight percent cobalt, iron-cobalt alloys with approximately 5 to 12 weight percent cobalt, and cobalt rare-earth intermetallics that exhibit some of the highest magnetic properties among known materials.
The artificial isotope cobalt-60 serves as an important gamma-ray source in both medical and industrial applications, demonstrating the versatility of cobalt across diverse technological sectors.
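A practical consequence of using cobalt-60 as a gamma source is that its activity decays over time and sources must eventually be replaced. A minimal decay sketch (the 5.27-year half-life is a standard reference value, not stated in the text above):

```python
import math

CO60_HALF_LIFE_YEARS = 5.27  # standard reference value for Co-60 (assumption; not from the text)

def remaining_fraction(years: float) -> float:
    """Fraction of Co-60 activity remaining after the given number of years."""
    return math.exp(-math.log(2) * years / CO60_HALF_LIFE_YEARS)

# A Co-60 source loses roughly half its activity every half-life:
print(f"{remaining_fraction(5.27):.2f}")   # 0.50
print(f"{remaining_fraction(10.54):.2f}")  # 0.25
```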
Cobalt-base alloys can be characterized as materials exhibiting exceptional wear resistance, corrosion resistance, and heat resistance, maintaining strength even at elevated temperatures. These properties arise from several key factors: the crystallographic nature of cobalt and its response to stress, solid-solution-strengthening effects from chromium, tungsten, and molybdenum, the formation of metal carbides, and the corrosion resistance imparted by chromium content.
The selection of cobalt-base alloy compositions depends on specific application requirements. Softer and tougher compositions find use in high-temperature applications such as gas-turbine vanes and buckets, while harder grades provide superior resistance to wear in demanding industrial environments.
Many commercial cobalt-base alloys trace their origins to the cobalt-chromium-tungsten and cobalt-chromium-molybdenum ternary systems first investigated by Elwood Haynes in the early 20th century. Haynes discovered the high strength and stainless nature of binary cobalt-chromium alloys, subsequently identifying tungsten and molybdenum as powerful strengthening agents within the cobalt-chromium system.
Haynes named these alloys Stellite, after the Latin word "stella" (star), in reference to their bright, star-like luster. Recognizing their exceptional high-temperature strength, he promoted Stellite alloys as cutting tool materials, establishing a foundation for modern wear-resistant applications.
Contemporary cobalt-base wear alloys remain remarkably similar to Haynes' early formulations, with the most significant improvements relating to the control of carbon and silicon, which were impurities in the original alloys. The primary distinctions among current Stellite alloy grades involve carbon and tungsten contents, which directly influence the amount and type of carbide formation in the microstructure during solidification.
Carbon content significantly affects hardness, ductility, and resistance to abrasive wear, and tungsten content likewise influences these properties. Stellite alloys typically contain 25-30% chromium, up to 1% molybdenum, 2-15% tungsten, and 0.25-3.3% carbon, with iron, nickel, silicon, and manganese held at low levels and cobalt comprising the balance.
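The composition window above can be captured as a simple range check. In the sketch below, the (min, max) ranges are taken from the text, while the function name and the example compositions are illustrative only, not real datasheet values:

```python
# Nominal Stellite composition window (wt%), as listed in the text.
# Elements given only as a maximum use a (0, max) range.
STELLITE_RANGES = {
    "Cr": (25.0, 30.0),
    "Mo": (0.0, 1.0),
    "W":  (2.0, 15.0),
    "C":  (0.25, 3.3),
}

def within_window(composition: dict) -> bool:
    """Check each specified element against its (min, max) wt% range."""
    return all(lo <= composition.get(el, 0.0) <= hi
               for el, (lo, hi) in STELLITE_RANGES.items())

# Hypothetical grades (illustrative values only):
print(within_window({"Cr": 30.0, "W": 12.0, "C": 2.5}))  # True
print(within_window({"Cr": 20.0, "W": 12.0, "C": 2.5}))  # False: Cr below range
```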
Wear phenomena generally fall into three main categories: abrasive wear, sliding wear, and erosive wear. The specific type of wear encountered in a particular application represents a crucial factor influencing the selection of appropriate wear-resistant materials.
Abrasive wear occurs when hard particles or hard projections on a counter-face are forced against and moved relative to a surface. The distinction between high-stress and low-stress abrasion relates to the condition of the abrasive medium after interaction with the surface. High-stress abrasion results from the entrapment of hard particles between metallic surfaces in relative motion, while low-stress abrasion occurs when moving surfaces contact packed abrasives such as soil and sand.
In cobalt-base wear alloys containing hard phases, abrasion resistance generally increases with the volume fraction of the hard phase. However, abrasion resistance is strongly influenced by the size and shape of hard phase precipitates within the microstructure and the characteristics of the abrading species.
Sliding wear represents perhaps the most complex wear mechanism, not in concept but in how different materials respond to sliding conditions. This type of wear occurs whenever two surfaces are forced together and moved relative to one another, with damage potential increasing markedly when both surfaces are metallic and little or no lubrication is present.
The gas turbine industry has historically been the predominant user of high-temperature alloys, with aircraft gas turbines requiring elevated-temperature strength, thermal fatigue resistance, and oxidation resistance. Land-based gas turbines, typically burning lower-grade fuels and operating at lower temperatures, prioritize sulfidation resistance as the major concern.
Modern applications of high-temperature alloys have become increasingly diversified as industries seek greater efficiency from fossil fuel and waste combustion and develop new chemical processing techniques. Cobalt-base high-temperature alloys typically contain chromium at 20-23%, tungsten at 7-15%, nickel at 10-22%, iron at 3% maximum, carbon at 0.1-0.6%, with cobalt comprising the balance.
Although cobalt-base alloys are not as widely used as nickel and nickel-iron alloys in high-temperature applications, they play an important role due to their excellent sulfidation resistance and strength at temperatures exceeding those at which gamma-prime and gamma-double-prime precipitates in nickel and nickel-iron alloys dissolve.
While cobalt-base wear-resistant alloys possess some resistance to aqueous corrosion, they face limitations from grain boundary carbide precipitation, depletion of vital alloying elements in the matrix following carbide or Laves precipitate formation, and chemical segregation in cast and weld overlay materials.
Wrought cobalt-base high-temperature alloys, characterized by homogeneous microstructures and lower carbon contents, demonstrate superior aqueous corrosion resistance compared to wear-resistant grades but still fall short of nickel-chromium-molybdenum alloys in corrosion performance.
To meet industrial demand for alloys that combine outstanding aqueous corrosion resistance with cobalt's inherent wear resistance and high strength over a wide temperature range, several low-carbon, wrought cobalt-nickel-chromium-molybdenum alloys have been developed. These alloys typically contain 20-25% chromium, about 2% tungsten, 5-10% molybdenum, 9-35% nickel, up to 3% iron, up to 0.08% carbon, and up to 0.1% nitrogen, with cobalt comprising the balance.
This comprehensive approach to cobalt-base alloy development ensures that specific industrial applications can benefit from optimized material properties while maintaining the fundamental advantages that make cobalt an essential element in advanced engineering materials.